Generate Sphinx docs from docstrings not working - Python

I have a project with the following structure (which I would like to keep):
my_project
├── build              # here is where sphinx should dump into
├── requirements.txt
├── make.bat
├── Makefile
├── ...                # more config files
├── doc                # this is where I want sphinx files to live
│   ├── conf.py
│   └── index.rst
├── src
│   └── my_project
│       ├── __init__.py
│       ├── module_1
│       │   ├── __init__.py
│       │   └── ...
│       └── util
│           ├── __init__.py
│           └── ...
└── tests
    ├── module_1
    │   ├── __init__.py
    │   └── ...        # testing module 1
    └── util
        ├── __init__.py
        └── ...        # testing util stuff
I recreated it on GitHub; the results can be reproduced by executing the my_setup.sh inside the repository.
I want to build the documentation from docstrings. I used Sphinx's quickstart to generate the necessary config, but when I call make html, the resulting documentation doesn't include any docstrings from my source code, i.e. everything in my_project/src/my_project. Sphinx's documentation is a bit overwhelming, given that I feel I am trying to set up something very basic.
Relevant info from config files (please tell me if I forgot something important):
Makefile
SPHINXOPTS =
SPHINXBUILD = sphinx-build
SPHINXPROJ = my_project
SOURCEDIR = doc
BUILDDIR = build
...
make.bat
set SOURCEDIR=doc
set BUILDDIR=build
set SPHINXPROJ=my_project
...
conf.py
import os
import sys
sys.path.insert(0, os.path.abspath('../src/my_project'))
...
extensions = [
'sphinx.ext.autodoc',
'sphinx.ext.todo',
'sphinx.ext.coverage',
]
...
I tried this as well, but it first put a bunch of generated project files into doc that I'd rather not have there (fixed by omitting the -F parameter), and it also didn't find any of the modules:
$ sphinx-apidoc -F -o doc/ src/my_project/
$ cd doc
$ make html
Running Sphinx v1.7.2
loading pickled environment... done
building [mo]: targets for 0 po files that are out of date
building [html]: targets for 0 source files that are out of date
updating environment: 0 added, 2 changed, 0 removed
reading sources... [100%] my_project.util
WARNING: autodoc: failed to import module 'my_project'; the following exception was raised:
No module named 'my_project'
WARNING: autodoc: failed to import module 'my_project.util.test_file'; the following exception was raised:
No module named 'my_project'
WARNING: autodoc: failed to import module 'my_project.util'; the following exception was raised:
No module named 'my_project'
looking for now-outdated files... none found
pickling environment... done
checking consistency... /home/arne/workspace/git/my_project/doc/my_project.rst: WARNING: document isn't included in any toctree
done
preparing documents... done
writing output... [100%] my_project.util
generating indices... genindex
writing additional pages... search
copying static files... done
copying extra files... done
dumping search index in English (code: en) ... done
dumping object inventory... done
build succeeded, 4 warnings.

There are a couple of issues with your MCVE.
reST source files must not reside in the output directory build; they belong in the Sphinx source directory doc. You should have done this instead: sphinx-apidoc -o doc src/my_project.
As @mzjn mentioned, you need to uncomment and adjust some lines in your conf.py to resolve the WARNING: autodoc: failed to import module errors.
# -- Path setup --------------------------------------------------------------
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os
import sys
# sys.path.insert(0, os.path.abspath('.'))
sys.path.insert(0, os.path.abspath('../src/'))
After those two changes, I was able to successfully build your docs with its API.
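If you want to sanity-check the path before rebuilding, you can reproduce the insert from conf.py in a throwaway Python session started from the doc directory (a minimal sketch; my_project and my_project.util are the names from the question):
import os
import sys

# The same insertion conf.py performs: add the parent of the package,
# not the package directory itself.
sys.path.insert(0, os.path.abspath('../src'))

import my_project           # should now import cleanly
import my_project.util      # the module autodoc failed to find above
print(my_project.__file__)  # shows which copy of the package was picked up
If these imports fail here, autodoc will fail the same way during make html.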

Related

Do we need an SConscript file in every source directory

I am using SCons to compile my project.
In my project, the source files are in different directories.
Do we need an SConscript file in every directory to compile those source files?
I tried to compile all directories with a single SConscript file, but all the object files end up in my source directory only.
I am using this function:
env.Library('libs', files_list)
If files_list contains only file names, the object files are generated in the variant directory.
If files_list contains file paths, the object files are generated in the source directory.
Can you tell me how to do this?
I prepared an example that shows how to compile a project like yours with just one SConstruct script (no subsidiary SConscripts) using the SCons VariantDir() function. I decided to do this in a separate answer so that it would be easier to read.
The VariantDir() function isn't documented very well, so the behavior you mention regarding the placement of the compiled object files isn't straightforward to fix. The "trick" is to refer to all of your source files in the variant directory, not in your actual source directory, as can be seen below.
Here is the structure of the source files in my project:
$ tree .
.
├── SConstruct
├── src1
│   ├── class1.cc
│   └── class1.h
├── src2
│   ├── class2.cc
│   └── class2.h
└── srcMain
    └── main.cc
Here is the SConstruct:
env = Environment()

# Set the include paths
env.Append(CPPPATH = ['src1', 'src2'])

# Notice the source files are referred to in the build dir.
# If you don't do this, the compiled objects will be in the src dirs.
src1Sources = ['build/lib1/class1.cc']
src2Sources = ['build/lib2/class2.cc']
mainSources = ['build/mainApp/main.cc']

env.VariantDir(variant_dir = 'build/lib1', src_dir = 'src1', duplicate = 0)
env.VariantDir(variant_dir = 'build/lib2', src_dir = 'src2', duplicate = 0)
env.VariantDir(variant_dir = 'build/mainApp', src_dir = 'srcMain', duplicate = 0)

lib1 = env.Library(target = 'build/lib1/src1', source = src1Sources)
lib2 = env.Library(target = 'build/lib2/src2', source = src2Sources)
env.Program(target = 'build/mainApp/main', source = [mainSources, lib1, lib2])
Here is the compilation output:
$ scons
scons: Reading SConscript files ...
scons: done reading SConscript files.
scons: Building targets ...
g++ -o build/lib1/class1.o -c -Isrc1 -Isrc2 src1/class1.cc
ar rc build/lib1/libsrc1.a build/lib1/class1.o
ranlib build/lib1/libsrc1.a
g++ -o build/lib2/class2.o -c -Isrc1 -Isrc2 src2/class2.cc
ar rc build/lib2/libsrc2.a build/lib2/class2.o
ranlib build/lib2/libsrc2.a
g++ -o build/mainApp/main.o -c -Isrc1 -Isrc2 srcMain/main.cc
g++ -o build/mainApp/main build/mainApp/main.o build/lib1/libsrc1.a build/lib2/libsrc2.a
scons: done building targets.
And here is the resulting project structure after compiling:
$ tree .
.
├── build
│   ├── lib1
│   │   ├── class1.o
│   │   └── libsrc1.a
│   ├── lib2
│   │   ├── class2.o
│   │   └── libsrc2.a
│   └── mainApp
│       ├── main
│       └── main.o
├── SConstruct
├── src1
│   ├── class1.cc
│   └── class1.h
├── src2
│   ├── class2.cc
│   └── class2.h
└── srcMain
    └── main.cc
It should be mentioned that a more straightforward way to do this is with the SConscript() function, specifying the variant_dir, but if your requirements don't allow you to do so, this example will work. The SCons man page has more info about the VariantDir() function. There you will also find the following:
Note that VariantDir() works most naturally with a subsidiary SConscript file.
To answer your first question: no, it's not necessary to have an SConscript in every src sub-directory to be able to compile the files in that directory. Everything can be done from one single SConstruct.
Having said that, it's often considered cleaner and better organized to have an SConscript in every src sub-directory. Typically in this situation, the root SConstruct would set up things that are common to the entire project and orchestrate calling into the src sub-directories. Then, the SConscript in each of the src subdirs would focus on the particulars of that subdir. I prefer this approach, as it's more modular. Additionally, this would allow you to call the same src subdir SConscript with different environments to compile different versions of the same code, like debug and release.
All of this can be done by creating an environment in the SConstruct and then passing it to the subdirs with the SConscript() function. Here's an example:
SConstruct
env = Environment()
env.Append(CPPPATH = '/some/dir/common/to/all')

SConscript('src/subdirA/SConscript',
           variant_dir = 'build/subdirA',
           duplicate = 0,
           exports = 'env')

SConscript('src/subdirB/SConscript',
           variant_dir = 'build/subdirB',
           duplicate = 0,
           exports = 'env')
src/subdirA/SConscript
Import('env')
# If you need to add specific things to the env, then you should clone it,
# else the changes will be seen in other subdirs: clonedEnv = env.Clone()
# No need to specify the path to the source files if all source files are in
# the same dir as this SConscript.
env.Library(target='subdirA', source='fileA.cc')
src/subdirB/SConscript
Import('env')
# If you need to add specific things to the env, then you should clone it,
# else the changes will be seen in other subdirs: clonedEnv = env.Clone()
env.Library(target='subdirB', source='fileB.cc')
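If the top-level SConstruct also needs to link the libraries built in the subdirs, each SConscript can hand its target back with Return(), and the SConscript() call returns that value. A sketch building on the example above (the top-level main.cc is a hypothetical addition):
# In src/subdirA/SConscript, after building the library:
#   lib = env.Library(target='subdirA', source='fileA.cc')
#   Return('lib')

# Back in SConstruct, collect the returned targets and link them:
libA = SConscript('src/subdirA/SConscript',
                  variant_dir = 'build/subdirA',
                  duplicate = 0,
                  exports = 'env')
libB = SConscript('src/subdirB/SConscript',
                  variant_dir = 'build/subdirB',
                  duplicate = 0,
                  exports = 'env')
env.Program(target = 'build/main', source = ['main.cc', libA, libB])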
As for the last questions, I don't fully understand what you're looking for, but with the option I explained above, the resulting compiled targets will always be placed in the VariantDir.

Python: import modules one level above, without using sys.path

Update: I have changed my file directory
I have a directory structure as follows and I would like to import a module in a parent directory.
project/
├── __init__.py
├── main.py
└── APP_NAME/
    ├── parser/
    │   ├── __init__.py
    │   └── parser.py
    └── test/
        ├── __init__.py
        └── parser_test.py
parser.py
class Parser(object):
    pass
main.py (Works fine)
from APP_NAME.parser.parser import Parser
parser_test.py (Throws error)
from ..APP_NAME.parser.parser import Parser
Throws the following error at parser_test.py
Parent module '' not loaded, cannot perform relative import
I know I can fix it using sys.path.append(), but I want to import it like a package the way I did it in main.py.
Any help is appreciated. Thanks.
I had to check back at one of my projects for a reference.
To test files in the tests folder, you must first create setup.py so that you can install your project for Python to use it.
If on Linux, use the command sudo python setup.py install to install the package. When changes have been made to the project, you must install again for the changes to take effect.
These folders will be created in your root project directory after installing:
build, dist, and project.egg-info.
You may need to clean the build directory before re-installing to update:
python setup.py clean
python setup.py build
python setup.py install
Project Structure
project
├── setup.py
├── tests
│   └── parser_test.py
└── project
    ├── __init__.py
    ├── __init__.pyc
    ├── main.py
    └── parser
        ├── __init__.py
        ├── __init__.pyc
        ├── parser.py
        └── parser.pyc
project/setup.py
from setuptools import setup

# Make sure the project name will not conflict with other libraries.
# For example, do not name the project 'os', 'sys', etc.
setup(
    name='project',
    description='My project description',
    author='your_online_name',
    license='MIT',  # Check out software licenses
    packages=['project', 'tests']
)
project/tests/parser_test.py
from project.parser import Parser
parser = Parser()
project/project/__init__.py
from . import parser
project/project/parser/__init__.py
from .parser import Parser
project/project/parser/parser.py
class Parser(object):
    pass
You shouldn't be using absolute imports within your package. In-package imports should be done with relative imports, this way:
parser_test.py
from ..parser.parser import Parser
With relative imports in Python, the first dot refers to the file's directory and each extra dot refers to a parent directory.
In this case, you would be pointing to the project/parser/parser.py file, which from test_parser.py's standpoint is ../parser/parser.py
If you are using Python 2, you should add the following line at the top of all the files in your parser package
from __future__ import absolute_import
This will prevent you from accidentally using absolute imports inside your package files.
Still assuming you are working with Python 2, you should also import unicode_literals for native Unicode support and print_function to replace the print statement with the print() function.
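Put together, a minimal module header following that advice would look like this (a sketch of the convention described above, not code from the question; Python 2 only):
# At the top of every module in the package:
from __future__ import absolute_import, print_function, unicode_literals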
However, I would rather have my tests in the top folder of the package, which, assuming the package is called project and not parser, would give the following directory structure:
project/ # top project directory
├── main.py
└── project # top package directory
    ├── __init__.py # this file is required even if it is empty
    ├── parser
    │   ├── __init__.py
    │   └── parser.py
    └── tests
        └── test_parser.py
Also, the project/project/parser/__init__.py could contain the following:
from .parser import Parser
So that your main.py file could import the Parser class like this:
from project.parser import Parser
instead of the more tedious:
from project.parser.parser import Parser
Your test_parser.py file, however, will still have to import the Parser class like this:
from ..parser.parser import Parser
because the classes exposed in an __init__.py file are not available to relative imports.
Finally, if you are starting a new independent project, you should do it in Python 3 (that's a PEP recommendation), where all the above rules apply, except that the from __future__ imports are unnecessary.
Sources: https://axialcorps.wordpress.com/2013/08/29/5-simple-rules-for-building-great-python-packages/

Sphinx autodoc - documenting classes across multiple subpackages of a repository

I am trying to use Sphinx (1.5.3) to create documentation for a Python repo, with the following structure:
main_repo/
├── docs
├── src
│   ├── subrepo1
│   ├── subrepo2
│   └── subrepo3
└── tests
The Python code is structured into three different Python packages inside src. Here subrepo1 is just part of main_repo, but subrepo2 and subrepo3 are Git submodules linking to independent Git repos. The Sphinx files live in the docs folder:
docs
├── _build
├── conf.py
├── index.rst
├── make.bat
├── Makefile
├── _static
└── _templates
In conf.py I have used
sys.path.insert(0, os.path.abspath(os.path.join('path/to/src')))
to add the full path of the src directory to sys.path, and in index.rst I have added the following directives to generate documentation for one class in the subrepo1 package:
.. toctree::
   :maxdepth: 2
   :caption: Contents:

.. automodule:: subrepo1.module
.. autoclass:: Class1
   :members:
Here module is the name of the module module.py inside the package subrepo1 and Class1 is a class inside module.
I am getting the following error when I run make html:
/path/to/src/docs/index.rst:13: WARNING: autodoc: failed to import module u'subrepo1.module'; the module executes module level statement and it might call sys.exit().
/path/to/src/docs/index.rst:15: WARNING: don't know which module to import for autodocumenting u'Class1' (try placing a "module" or "currentmodule" directive in the document, or giving an explicit module name)
I want to generate docs for all classes in all submodules and subpackages, including the class constructors and all "public" methods and attributes, in these three subrepos inside main_repo. What are the directives to do this in index.rst?

Python: include a third party library in a personal Python package

I would like to include a third party library in my Python script folder to distribute everything together (I am aware of the distribution license, and this library is fine to distribute). This is in order to avoid installing the library on another machine.
Say I have a script (my_script.py) which calls this external library. I tried to copy this library from the site-packages subdirectory of the Python directory into the directory where my files live, but it seems not to be enough (I think the reason is in the __init__.py of this library, which probably needs the folder to be on the PYTHONPATH).
Would it be reasonable to insert some lines of code in my_script.py to temporarily append its folder to sys.path in order to make the whole thing work?
For instance, if I have a structure similar to this:
Main_folder
├── my_script.py
└── external_lib_folder
    ├── __init__.py
    └── external_lib.py
where external_lib_folder is the external library I copied from site-packages into Main_folder, would it be fine to write these lines (e.g.) in my_script.py?
import os,sys
main_dir = os.path.dirname(os.path.abspath(__file__))
sys.path.append(main_dir)
EDIT
I ended up choosing the sys.path.append solution. I added these lines to my my_script.py:
import os, sys
# temporarily appends the folder containing this file into sys.path
main_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)),'functions')
sys.path.append(main_dir)
Anyway, I chose to insert this as an edit in my question and accept the answer of Torxed because of the time he spent helping me (and of course because his solution works as well).
Python3
import importlib.machinery

namespace = 'external_lib'
loader = importlib.machinery.SourceFileLoader(namespace, '/home/user/external_lib_folder/external_lib.py')
external_lib = loader.load_module(namespace)

# How to use it:
external_lib.function(data_or_something)
This would be an ideal way to load custom paths in Python 3.
Not entirely sure this is what you wanted, but it's relevant enough to post an alternative to adding to sys.path.
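Note that load_module() has since been deprecated; on newer Python 3 versions the same file-based loading can be done with importlib.util (a sketch reusing the same hypothetical path as above):
import importlib.util

spec = importlib.util.spec_from_file_location(
    'external_lib', '/home/user/external_lib_folder/external_lib.py')
external_lib = importlib.util.module_from_spec(spec)
spec.loader.exec_module(external_lib)

external_lib.function(data_or_something)  # same hypothetical usage as above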
Python2
In Python 2 you could just do the following (if I'm not mistaken; it's been a while since I used an older version of Python):
external_lib = __import__('external_lib_folder')
This does, however, require you to keep the __init__.py and a proper declaration of functions in said script, otherwise it will fail.
It's also important that the folder you're trying to import from has the same name that the __init__.py script in said folder uses when importing its sub-libraries. For instance, geopy would be:
./myscript.py
./geopy/
./geopy/__init__.py
./geopy/compat.py
...
And the code of myscript.py would look like this:
handle = __import__('geopy')
print(handle)
Which would produce the following output:
[user@machine project]$ python2 myscript.py
<module 'geopy' from '/home/user/project/geopy/__init__.pyc'>
[user@machine project]$ tree -L 2
.
├── geopy
│   ├── compat.py
│   ├── compat.pyc
│   ├── distance.py
│   ├── distance.pyc
│   ├── exc.py
│   ├── exc.pyc
│   ├── format.py
│   ├── format.pyc
│   ├── geocoders
│   ├── __init__.py
│   ├── __init__.pyc
│   ├── location.py
│   ├── location.pyc
│   ├── point.py
│   ├── point.pyc
│   ├── units.py
│   ├── units.pyc
│   ├── util.py
│   ├── util.pyc
│   └── version.pyc
└── myscript.py
2 directories, 20 files
Because in geopy's __init__.py there are imports such as from geopy.point import Point, which require a namespace or a folder named geopy to be present.
Therefore you can't rename the folder to functions and place a folder called geopy in there, because that won't work; nor will placing the contents of geopy in a folder called functions, because that's not what geopy will look for.
Adding the path to sys.path (Py2 + 3)
As discussed in the comments, you can also add the folder to your sys.path variable prior to imports.
import sys
sys.path.insert(0, './functions')
import geopy
print(geopy)
>>> <module 'geopy' from './functions/geopy/__init__.pyc'>
Why this is a bad idea: it will work, and it is used by many. The problem is that you might shadow system modules, or other modules might get loaded from unexpected folders if you're not careful where you import things from. Therefore, use .insert(0, ...) cautiously and be sure you actually want to risk replacing system built-ins with "shady" path names.
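A minimal illustration of that shadowing risk (hypothetical: a stray file named json.py sitting inside ./functions):
import sys
sys.path.insert(0, './functions')

# If ./functions/json.py exists, this imports it instead of the standard
# library's json module, because './functions' is searched first.
import json
print(json.__file__)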
What you suggest is bad practice; it is a fragile arrangement. The best solution (which is also easy to do) is to package the library properly and add an explicit dependency, like this:
from setuptools import setup

setup(name='funniest',
      version='0.1',
      description='The funniest joke in the world',
      url='http://github.com/storborg/funniest',
      author='Flying Circus',
      author_email='flyingcircus@example.com',
      license='MIT',
      packages=['funniest'],
      install_requires=[
          'markdown',
      ],
      zip_safe=False)
This will work if the third party library is on PyPI. If it's not, use this:
setup(
    ...
    dependency_links=['http://github.com/user/repo/tarball/master#egg=package-1.0']
    ...
)
(See this explanation for packaging).

Python setup.py script isn't installing modules correctly

Repo location: https://github.com/willkara/SakaiPy
So I have this Python module I'm creating. It currently has this structure:
SakaiPy
├── SakaiPy
│   ├── __init__.py #1
│   └── RequestGenerator.py
├── SakaiTools
│   ├── __init__.py #2
│   ├── Assignment.py
│   ├── Announcement.py
│   └── ...etc.py
└── setup.py
__init__.py #1 looks like:
__all__ = ['SakaiTools']
from SakaiTools import *
__init__.py #2 is empty
My setup.py looks like:
version='1.0',
description='Python interface to the Sakai RESTful API\'s',
license='MIT',
author='William Karavites',
author_email='wkaravites@gmail.com',
url='https://github.com/willkara/SakaiPy',
packages=['SakaiPy', 'SakaiPy/SakaiTools'],
requires={
    "mechanize",
    "cookielib",
    "requests",
    "simplejson"}
)
My problem is that the module seems to be building incorrectly.
When I try and use the module like this:
#!/usr/bin/python
# -*- coding: utf-8 -*-
from SakaiPy import *
print "hello"
authInfo={}
authInfo['baseURL'] =""
authInfo['loginURL']=""
authInfo['username']=""
authInfo['password']=""
rq = RequestGenerator.RequestGenerator(authInfo)
I get this error:
Traceback (most recent call last):
  File "../sakaiTest.py", line 14, in <module>
    rq = RequestGenerator.RequestGenerator(authInfo)
NameError: name 'RequestGenerator' is not defined
My guess is that my setup.py and __init__.py scripts are set up incorrectly.
You are going to want to change your directory structure, as right now you technically have two Python packages and you are giving setuptools an incorrect package path. In order to get the path you are looking for, you need to nest the SakaiTools directory within the SakaiPy directory. With this, you should be able to have the import you are looking for, and you can import SakaiTools as SakaiPy.SakaiTools like you appear to be trying to do.
SakaiPy
├── SakaiPy
│   ├── __init__.py # make this blank
│   ├── RequestGenerator.py
│   └── SakaiTools
│       ├── __init__.py # keep it blank
│       ├── Assignment.py
│       ├── Announcement.py
│       └── ...etc.py
└── setup.py
This will give you a single package with SakaiTools as a subpackage, which sounds like what you are looking for. You are going to need to remove your SakaiTools imports from the first __init__.py, as you will be able to access those imports just fine with this setup.
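With the nested layout, the calling script from the question can keep its imports explicit; a sketch reusing the question's own usage code:
from SakaiPy import RequestGenerator

authInfo = {}
authInfo['baseURL'] = ""
authInfo['loginURL'] = ""
authInfo['username'] = ""
authInfo['password'] = ""

rq = RequestGenerator.RequestGenerator(authInfo)

# The tools are now reachable as a subpackage:
from SakaiPy.SakaiTools import Assignment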
If you are looking to keep the two separate packages, you are going to need to tell setuptools that you have two different packages.
version='1.0',
description='Python interface to the Sakai RESTful API\'s',
license='MIT',
author='William Karavites',
author_email='wkaravites@gmail.com',
url='https://github.com/willkara/SakaiPy',
packages=['SakaiPy', 'SakaiTools'],
requires=(
    "mechanize",
    "cookielib",
    "requests",
    "simplejson",
)
