Generating Sphinx documentation from docstrings not working - Python

I have a project with the following structure (which I would like to keep):
├── build # here is where sphinx should dump into
├── requirements.txt
├── make.bat
├── Makefile
├── ... # more config files
├── doc # this is where I want sphinx files to live
│   ├──
│   └── index.rst
├── src
│   └── my_project
│ ├──
│   ├── module_1
│   │ ├──
│   │ └── ...
│   └── util
│   ├──
│   └── ...
└── tests
├── module_1
│ ├──
│ └── ... # testing module 1
└── util
└── ... # testing util stuff
I recreated the project on GitHub; the repository can be used to reproduce the results.
I want to build the documentation from docstrings. I used Sphinx's quickstart to generate the necessary config, but when I call make html, the resulting documentation doesn't include any docstrings from my source code, i.e. everything in my_project/src/my_project. Sphinx's documentation is a bit overwhelming, given that I feel I am trying to set up something very basic.
Relevant info from config files (please tell me if I forgot something important):
SPHINXBUILD = sphinx-build
SPHINXPROJ = my_project
BUILDDIR = build
set BUILDDIR=build
set SPHINXPROJ=my_project
import os
import sys
sys.path.insert(0, os.path.abspath('../src/my_project'))
extensions = [
I tried the following as well, but it put a bunch of build files into doc that I'd rather not have there (fixed by omitting the -F parameter), and it also didn't find any of the modules:
$ sphinx-apidoc -F -o doc/ src/my_project/
$ cd doc
$ make html
Running Sphinx v1.7.2
loading pickled environment... done
building [mo]: targets for 0 po files that are out of date
building [html]: targets for 0 source files that are out of date
updating environment: 0 added, 2 changed, 0 removed
reading sources... [100%] my_project.util
WARNING: autodoc: failed to import module 'my_project'; the following exception was raised:
No module named 'my_project'
WARNING: autodoc: failed to import module 'my_project.util.test_file'; the following exception was raised:
No module named 'my_project'
WARNING: autodoc: failed to import module 'my_project.util'; the following exception was raised:
No module named 'my_project'
looking for now-outdated files... none found
pickling environment... done
checking consistency... /home/arne/workspace/git/my_project/doc/my_project.rst: WARNING: document isn't included in any toctree
preparing documents... done
writing output... [100%] my_project.util
generating indices... genindex
writing additional pages... search
copying static files... done
copying extra files... done
dumping search index in English (code: en) ... done
dumping object inventory... done
build succeeded, 4 warnings.

There are a couple of issues with your MCVE.
The rST source files must not reside in the output directory build; they should be in the documentation source directory doc. You should have done this instead: sphinx-apidoc -o doc src/my_project.
As #mzjn mentioned, you need to uncomment and add some lines to your conf.py to resolve the WARNING: autodoc: failed to import module errors.
# -- Path setup --------------------------------------------------------------
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
import os
import sys
# sys.path.insert(0, os.path.abspath('.'))
sys.path.insert(0, os.path.abspath('../src/'))
After those two changes, I was able to successfully build your docs with its API.
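One more thing worth checking: the build log above also warns that my_project.rst is not included in any toctree. A minimal toctree in index.rst that pulls the apidoc-generated page in might look like this (a sketch; the document name my_project comes from the file named in the warning):

```rst
.. toctree::
   :maxdepth: 2
   :caption: Contents:

   my_project
```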


Do we need an SConscript file in every source directory?

I am using SCons to compile my project.
In my project, the source files are in different directories.
Do we need an SConscript file in every directory to compile those source files?
I tried to compile all directories with a single SConscript file, but all object files end up in my source directory only.
I am using this function:
If files_list contains only the file names, then the object files are generated in the variant directory.
If files_list contains the file path names, then the object files are generated in the source directory.
Can you tell me how to do this?
I prepared an example that shows how to compile a project like yours with just one SConstruct script (no subsidiary SConscripts) using the SCons VariantDir() function. I decided to do this in a separate answer so that it would be easier to read.
The VariantDir() function isn't documented very well, so the behavior you mention regarding the placement of the compiled object files isn't straightforward to fix. The "trick" is to refer to all of your source files in the variant directory, not in your actual source directory, as can be seen below.
Here is the structure of the source files in my project:
$ tree .
├── SConstruct
├── src1
│   ├──
│   └── class1.h
├── src2
│   ├──
│   └── class2.h
└── srcMain
Here is the SConstruct:
env = Environment()
# Set the include paths
env.Append(CPPPATH = ['src1', 'src2'])
# Notice the source files are referred to in the build dir.
# If you don't do this, the compiled objects will end up in the src dirs.
src1Sources = ['build/lib1/']
src2Sources = ['build/lib2/']
mainSources = ['build/mainApp/']
env.VariantDir(variant_dir = 'build/lib1', src_dir = 'src1', duplicate = 0)
env.VariantDir(variant_dir = 'build/lib2', src_dir = 'src2', duplicate = 0)
env.VariantDir(variant_dir = 'build/mainApp', src_dir = 'srcMain', duplicate = 0)
lib1 = env.Library(target = 'build/lib1/src1', source = src1Sources)
lib2 = env.Library(target = 'build/lib1/src2', source = src2Sources)
env.Program(target = 'build/mainApp/main', source = [mainSources, lib1, lib2])
Here is the compilation output:
$ scons
scons: Reading SConscript files ...
scons: done reading SConscript files.
scons: Building targets ...
g++ -o build/lib1/class1.o -c -Isrc1 -Isrc2 src1/
ar rc build/lib1/libsrc1.a build/lib1/class1.o
ranlib build/lib1/libsrc1.a
g++ -o build/lib2/class2.o -c -Isrc1 -Isrc2 src2/
ar rc build/lib1/libsrc2.a build/lib2/class2.o
ranlib build/lib1/libsrc2.a
g++ -o build/mainApp/main.o -c -Isrc1 -Isrc2 srcMain/
g++ -o build/mainApp/main build/mainApp/main.o build/lib1/libsrc1.a build/lib1/libsrc2.a
scons: done building targets.
And here is the resulting project structure after compiling:
$ tree .
├── build
│   ├── lib1
│   │   ├── class1.o
│   │   ├── libsrc1.a
│   │   └── libsrc2.a
│   ├── lib2
│   │   └── class2.o
│   └── mainApp
│   ├── main
│   └── main.o
├── SConstruct
├── src1
│   ├──
│   └── class1.h
├── src2
│   ├──
│   └── class2.h
└── srcMain
It should be mentioned that a more straightforward way to do this is with the SConscript() function, specifying the variant_dir, but if your requirements don't allow you to do so, this example will work. The SCons man page has more info about the VariantDir() function. There you will also find the following:
Note that VariantDir() works most naturally with a subsidiary SConscript file.
To answer your first question: No, it's not necessary to have a SConscript in every src sub-directory to be able to compile the files in that directory. Everything can be done from one single SConstruct.
Having said that, it's often considered cleaner and better organized to have a SConscript in every src sub-directory. Typically in this situation, the root SConstruct would set up things that are common to the entire project and orchestrate calling into the src sub-directories. Then, the SConscript in each of the src subdirs would focus on the particulars of that subdir. I prefer this approach, as it's more modular. Additionally, this would allow you to call the same src subdir SConscript with different environments to compile different versions of the same code, like debug and release.
All of this can be done by creating an environment in the SConstruct, and then passing it to the subdirs with the SConscript() function. Here's an example:
env = Environment()
env.Append(CPPPATH = '/some/dir/common/to/all')
SConscript('subdirA/SConscript',
           variant_dir = 'build/subdirA',
           duplicate = 0,
           exports = 'env')
SConscript('subdirB/SConscript',
           variant_dir = 'build/subdirB',
           duplicate = 0,
           exports = 'env')
Then, in subdirA/SConscript:
Import('env')
# If you need to add specific things to the env, then you should clone it,
# else the changes will be seen in other subdirs: clonedEnv = env.Clone()
# No need to specify the path to the source files if all source files are in
# the same dir as this SConscript.
env.Library(target='subdirA', source='')
And in subdirB/SConscript:
Import('env')
# If you need to add specific things to the env, then you should clone it,
# else the changes will be seen in other subdirs: clonedEnv = env.Clone()
env.Library(target='subdirB', source='')
As for your last question, I don't really understand what you're looking for, but using the option I explained above, the resulting compiled targets will always be placed in the VariantDir.

Python: import modules 1 level above, without using sys.path

Update: I have changed my file directory
I have a directory structure as follows and I would like to import a module in a parent directory.
class Parser(object):
    pass  # works fine
from APP_NAME.parser.parser import Parser  # throws an error
I also tried a relative import:
from ..APP_NAME.parser.parser import Parser
which throws the following error:
Parent module '' not loaded, cannot perform relative import
I know I can fix it using sys.path.append(), but I want to import it like a package.
Any help is appreciated. Thanks.
I had to check back at one of my projects for a reference.
To test files in the tests folder you must first create a setup.py, so that you can install your project for Python to use it.
If on Linux, use the command sudo python setup.py install to install the package. When changes have been made to the project, you must install again for the changes to take effect.
These folders will be created in your root project directory after installing:
build, dist, and project.egg-info.
You may need to clean the build directory before re-installing for the changes to be picked up:
python setup.py clean
python setup.py build
python setup.py install
Project Structure
├── tests
│ └──
└── project
   ├── __init__.pyc
   └── parser
   ├── __init__.pyc
   └── parser.pyc
from setuptools import setup

# Make sure the project name will not conflict with other libraries.
# For example, do not name the project 'os', 'sys', etc.
setup(
    name='project',
    description='My project description',
    license='MIT',  # check out software licenses
    packages=['project', 'tests'],
)
from project.parser import Parser
parser = Parser()
from . import parser
from .parser import Parser
class Parser(object):
You shouldn't be using absolute imports within your package. In-package imports should be done with relative imports, this way:
from ..parser.parser import Parser
With relative imports in Python, the first dot refers to the file's directory and each extra dot refers to the parent directory.
In this case, you would be pointing to the file under project/parser/, which from the test file's standpoint lies one level up (../).
If you are using Python 2, you should add the following line at the top of all the files in your parser package
from __future__ import absolute_import
This will keep you from using absolute imports inside your package files by mistake.
Still assuming you are working with Python 2, you should also import unicode_literals for native Unicode support and print_function to replace the print statement with the print() function.
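For reference, the three __future__ imports mentioned can sit together on one line at the top of each module (valid under Python 2.6+ and Python 3, where they are simply no-ops):

```python
from __future__ import absolute_import, print_function, unicode_literals

# Under Python 2 these make imports absolute by default, turn print into a
# function, and make string literals unicode; under Python 3 they change nothing.
print("works the same on both major versions")
```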
However, I would rather have my tests in the top folder of the package, which, assuming the package is called project and not parser, would give the following directory structure:
project/ # top project directory
└── project # top package directory
├── # this file is required even if it is empty
├── parser
│   ├──
│   └──
└── tests
Also, project/project/parser/__init__.py could contain the following:
from .parser import Parser
So that your file could import the Parser class like this:
from project.parser import Parser
instead of the more tedious:
from project.parser.parser import Parser
Your test file, however, will still have to import the Parser class like this:
from ..parser.parser import Parser
because the classes exposed in an __init__.py file are not available to relative imports.
Finally, if you are starting a new independent project, you should do it in Python 3 (that's a PEP recommendation), where all the above rules apply, except the from __future__ imports which are unnecessary.

Sphinx autodoc - documenting classes across multiple subpackages of a repository

I am trying to use Sphinx (1.5.3) to create documentation for a Python repo, with the following structure:
├── docs
├── src
| ├── subrepo1
| ├── subrepo2
| └── subrepo3
└── tests
The Python code is structured into three different Python packages inside src. Here subrepo1 is just part of main_repo, but subrepo2 and subrepo3 are Git submodules linking to independent Git repos. The Sphinx files live in the docs folder:
├── _build
├── index.rst
├── make.bat
├── Makefile
├── _static
└── _templates
In conf.py I have used
sys.path.insert(0, os.path.abspath(os.path.join('path/to/src')))
to add the full path of the src directory to sys.path, and in index.rst I have added the following directives to generate documentation for one class in the subrepo1 package:
.. toctree::
:maxdepth: 2
:caption: Contents:
.. automodule:: subrepo1.module
.. autoclass:: Class1
Here module is the name of the module inside the package subrepo1 and Class1 is a class inside module.
I am getting the following error when I run make html:
/path/to/src/docs/index.rst:13: WARNING: autodoc: failed to import module u'subrepo1.module'; the module executes module level statement and it might call sys.exit().
/path/to/src/docs/index.rst:15: WARNING: don't know which module to import for autodocumenting u'Class1' (try placing a "module" or "currentmodule" directive in the document, or giving an explicit module name)
I want to generate docs for all classes in all submodules and subpackages, including the class constructors and all "public" methods and attributes, in these three subrepos inside main_repo. What are the directives to do this in index.rst?
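For reference, autodoc can be told to pull in every documented member rather than just one named class; a sketch of such directives, using standard sphinx.ext.autodoc options (:members:, :undoc-members:, :show-inheritance:, and :special-members: for constructors), might look like this:

```rst
.. automodule:: subrepo1.module
   :members:
   :undoc-members:
   :show-inheritance:
   :special-members: __init__
```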

Python: include a third party library in a personal Python package

I would like to include a third-party library in my Python script folder so I can distribute everything together (I am aware of the distribution license, and this library is fine to distribute). This is in order to avoid installing the library on another machine.
Say I have a script which calls this external library. I tried to copy this library from the site-packages subdirectory of the Python directory into the directory where I have my files, but it seems not to be enough (I think the reason is in this library's package initialization, which probably needs the folder to be on the PYTHONPATH).
Would it be reasonable to insert some lines of code in my script to temporarily append its folder to sys.path in order to make the whole thing work?
For instance, if I have a structure similar to this:
and external_lib_folder is the external library I copied from site-packages into my Main_folder, would it be fine if I write these lines (e.g.) in my script:
import os, sys
main_dir = os.path.dirname(os.path.abspath(__file__))
sys.path.append(main_dir)
I ended up choosing the sys.path.append solution. I added these lines to my script:
import os, sys
# temporarily appends the folder containing this file to sys.path
main_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'functions')
sys.path.append(main_dir)
Anyway, I chose to insert this as an edit in my question and accept the answer of Torxed because of the time he spent helping me (and of course because his solution works as well).
import importlib.machinery, imp
namespace = 'external_lib'
loader = importlib.machinery.SourceFileLoader(namespace, '/home/user/external_lib_folder/')
external_lib = loader.load_module(namespace)
# How to use it:
This would be an ideal way to load custom paths in Python 3.
Not entirely sure this is what you wanted, but it's relevant enough to post as an alternative to adding to sys.path.
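On Python 3.5+, loader.load_module() is deprecated; an equivalent with importlib.util looks like this (a sketch; the throwaway module file created here is a stand-in for the real external library):

```python
import importlib.util
import os
import sys
import tempfile

def load_module_from_path(name, path):
    """Load a module from an explicit file path."""
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module  # register first so nested imports can resolve
    spec.loader.exec_module(module)
    return module

# Demo with a throwaway module file; 'external_lib' is an illustrative name
tmpdir = tempfile.mkdtemp()
module_path = os.path.join(tmpdir, "external_lib.py")
with open(module_path, "w") as fh:
    fh.write("GREETING = 'hello'\n")

external_lib = load_module_from_path("external_lib", module_path)
print(external_lib.GREETING)  # prints 'hello'
```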
In Python 2 you could just do (if I'm not mistaken, it's been a while since I used an older version of Python):
external_lib = __import__('external_lib_folder')
This does however require you to keep the __init__.py and a proper declaration of functions in said script, otherwise it will fail.
It's also important that the folder you're trying to import from has the same name that the script in said folder uses to import its sub-libraries; for instance, geopy would be:
And the code of the importing script would look like this:
handle = __import__('geopy')
Which would produce the following output:
[user@machine project]$ python2
<module 'geopy' from '/home/user/project/geopy/__init__.pyc'>
[user@machine project]$ tree -L 2
├── geopy
│   ├──
│   ├── compat.pyc
│   ├──
│   ├── distance.pyc
│   ├──
│   ├── exc.pyc
│   ├──
│   ├── format.pyc
│   ├── geocoders
│   ├──
│   ├── __init__.pyc
│   ├──
│   ├── location.pyc
│   ├──
│   ├── point.pyc
│   ├──
│   ├── units.pyc
│   ├──
│   ├── util.pyc
│   └── version.pyc
2 directories, 20 files
Because geopy's __init__.py defines imports such as from geopy.point import Point, which requires a namespace or a folder called geopy to be present.
Therefore you can't rename the folder to functions and place a folder called geopy in there, because that won't work; nor will placing the contents of geopy in a folder called functions, because that's not what geopy will look for.
Adding the path to sys.path (Py2 + 3)
As discussed in the comments, you can also add the folder to your sys.path variable prior to imports.
import sys
sys.path.insert(0, './functions')
import geopy
>>> <module 'geopy' from './functions/geopy/__init__.pyc'>
Why this is a bad idea: it will work, and it is used by many. The problem is that you might shadow system modules, or other modules might get loaded from the wrong folders if you're not careful where you import things from. Therefore use .insert(0, ...) with care, and be sure you actually want to risk replacing system built-ins with "shady" path names.
What you suggest is bad practice; it is a fragile arrangement. The best solution (which is also easy to do) is to package your code properly and add an explicit dependency, like this:
from setuptools import setup

setup(
    name='funniest',  # illustrative project name
    description='The funniest joke in the world',
    author='Flying Circus',
    install_requires=['external_lib'],  # the explicit dependency; name illustrative
)
This will work if the third-party library is on PyPI. If it's not, use this:
(See this explanation for packaging).

Python script isn't installing modules correctly

Repo location:
So I have this Python module I'm creating. It currently has this structure:
├── SakaiPy
│   ├── #1
│   └──
├── SakaiTools
├── #2
└──
File #1 looks like:
from SakaiTools import *
File #2 is empty.
My setup.py looks like:
description='Python interface to the Sakai RESTful API\'s',
author='William Karavites',
My problem is that the module seems to be building incorrectly.
When I try and use the module like this:
# -*- coding: utf-8 -*-
from SakaiPy import *
print "hello"
authInfo['baseURL'] = ""
rq = RequestGenerator.RequestGenerator(authInfo)
I get this error:
Traceback (most recent call last):
File "../", line 14, in <module>
rq = RequestGenerator.RequestGenerator(authInfo)
NameError: name 'RequestGenerator' is not defined
My guess is that my setup.py and __init__.py scripts are set up incorrectly.
You are going to want to change your directory structure, as right now you technically have two Python modules and you are giving setuptools an incorrect package path. In order to get the layout you are looking for, you need to nest the SakaiTools directory within the SakaiPy directory. With this, you will have the import you are looking for, and you can import SakaiTools as SakaiPy.SakaiTools like you appear to be trying to do.
├── SakaiPy
│ ├── # make this blank
│ ├──
│ └── SakaiTools
│ ├── # keep it blank
│ ├──
│ ├──
│ └──
This will give you a single module with SakaiTools as a submodule, which sounds like what you are looking for. You are going to need to remove your SakaiTools imports from the first __init__.py, as you will be able to access those imports just fine with this setup.
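Under the layout above (each directory gaining an __init__.py as described), a setup.py for the nested structure might look like this sketch; find_packages() should discover both SakaiPy and SakaiPy.SakaiTools:

```python
from setuptools import setup, find_packages

setup(
    name='SakaiPy',
    version='0.1',  # illustrative version number
    description="Python interface to the Sakai RESTful API's",
    author='William Karavites',
    # discovers SakaiPy and SakaiPy.SakaiTools, provided each has an __init__.py
    packages=find_packages(),
)
```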
If you are looking to keep the two different modules, you are going to need to tell setuptools that you have two different modules.
description='Python interface to the Sakai RESTful API\'s',
author='William Karavites',