graph – Interface for the Theano graph¶
Reference¶
Core graph classes.

class theano.graph.basic.Apply(op, inputs, outputs)[source]¶
A Node representing the application of an operation to inputs.
An Apply instance serves as a simple structure with three important attributes:
- inputs: a list of Variable nodes that represent the arguments of the expression,
- outputs: a list of Variable nodes that represent the computed outputs of the expression, and
- op: an Op instance that determines the nature of the expression being applied.
In essence, an Apply instance is an object that represents the Python statement outputs = op(*inputs).
This class is typically instantiated by an Op.make_node method, which is called by Op.__call__.
The function theano.compile.function.function uses Apply.inputs together with Variable.owner to search the expression graph and determine which inputs are necessary to compute the function’s outputs.
A Linker uses the Apply instance’s op field to compute numeric values for the output variables.
Parameters:
- op (Op instance) –
- inputs (list of Variable instances) –
- outputs (list of Variable instances) –
Notes
The Variable.owner field of each Apply.outputs element is set to self in Apply.make_node.
If an output element has an owner that is neither None nor self, then a ValueError exception will be raised.
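The ownership invariant described in these notes can be illustrated with a small self-contained sketch. The ToyVariable/ToyApply classes below are hypothetical stand-ins for Theano's Variable and Apply (not the real implementation), showing how each output's owner and index get set when the node is built, and how a foreign owner would trigger a ValueError:

```python
# Toy sketch (not Theano's actual classes) of the Apply/Variable
# relationship: an Apply node ties an op to its input and output
# variables, and each output records its owner and position.

class ToyVariable:
    def __init__(self, name):
        self.name = name
        self.owner = None   # set when the variable becomes an Apply output
        self.index = None

class ToyApply:
    def __init__(self, op, inputs, outputs):
        self.op = op
        self.inputs = inputs
        self.outputs = outputs
        for i, out in enumerate(outputs):
            # mirrors the rule above: an output may be owned by
            # either no one or this node, never another node
            if out.owner is not None and out.owner is not self:
                raise ValueError("output is already owned by another node")
            out.owner = self
            out.index = i

a, b, c = ToyVariable("a"), ToyVariable("b"), ToyVariable("c")
node = ToyApply("add", inputs=[a, b], outputs=[c])
assert c.owner is node and c.index == 0
```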

clone()[source]¶
Duplicate this Apply instance with inputs = self.inputs.
Returns: A new Apply instance (or subclass instance) with new outputs.
Return type: object
Notes
Tags are copied from self to the returned instance.

clone_with_new_inputs(inputs, strict=True)[source]¶
Duplicate this Apply instance in a new graph.
Parameters:
- inputs (list of Variable) – List of Variable instances to use as inputs.
- strict (bool) – If True, the type fields of all the inputs must be equal to the current ones (or compatible; for instance, Tensor / GpuArray of the same dtype and broadcastable pattern, in which case they will be converted into the current Type), and the returned outputs are guaranteed to have the same types as self.outputs. If False, then there is no guarantee that the clone's outputs will have the same types as self.outputs, and cloning may not even be possible (it depends on the Op).
Returns: An Apply instance with the same Op but different outputs.
Return type: object

default_output()[source]¶
Returns the default output for this node.
Returns: An element of self.outputs, typically self.outputs[0].
Return type: Variable instance
Notes
May raise AttributeError if self.op.default_output is out of range, or if there are multiple outputs and self.op.default_output does not exist.

class theano.graph.basic.Constant(type, data, name=None)[source]¶
A Variable with a fixed value field.
Constant nodes make numerous optimizations possible (e.g. constant inlining in C code, constant folding, etc.)
Notes
The data field is filtered by what is provided in the constructor for the Constant’s type field.

clone()[source]¶
Clone this object, but do not clone the data, in order to lower memory requirements. The data is assumed to never change.
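As a purely illustrative sketch of this shallow-clone behaviour (a hypothetical ToyConstant, not Theano's implementation), note that the clone shares the very same data object rather than copying it:

```python
# Toy illustration of a shallow clone that shares immutable data.

class ToyConstant:
    def __init__(self, data, name=None):
        self.data = data        # assumed immutable; never copied
        self.name = name

    def clone(self):
        # Share the data to lower memory requirements, relying on
        # the assumption that it will never change.
        return ToyConstant(self.data, self.name)

orig = ToyConstant([1.5, 2.5], name="k")
dup = orig.clone()
assert dup is not orig
assert dup.data is orig.data    # same object, no copy made
```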

get_test_value()[source]¶
Get the test value.
Raises: TestValueError –


class theano.graph.basic.Node[source]¶
A Node in a Theano graph.
Currently, graphs contain two kinds of Nodes: Variables and Applys. Edges in the graph are not explicitly represented. Instead, each Node keeps track of its parents via Variable.owner / Apply.inputs.

class theano.graph.basic.Variable(type, owner=None, index=None, name=None)[source]¶
A Variable is a node in an expression graph that represents a variable.
The inputs and outputs of every Apply (theano.graph.basic.Apply) are Variable instances. The input and output arguments to create a function are also Variable instances. A Variable is like a strongly-typed variable in some other languages; each Variable contains a reference to a Type instance that defines the kind of value the Variable can take in a computation.
A Variable is a container for four important attributes:
- type: a Type instance defining the kind of value this Variable can have,
- owner: either None (for graph roots) or the Apply instance of which self is an output,
- index: the integer such that owner.outputs[index] is this_variable (ignored if owner is None),
- name: a string to use in pretty-printing and debugging.
A Variable which is the output of a symbolic computation has a reference to the Apply instance to which it belongs (property: owner) and the position of itself in the owner's output list (property: index).
There are a few kinds of Variables to be aware of:
- Variable (this base type) is typically the output of a symbolic computation.
- Constant: a subclass which adds a default and unreplaceable value, and requires that owner is None.
- TensorVariable: a subclass of Variable that represents a numpy.ndarray object.
- TensorSharedVariable: a shared version of TensorVariable.
- SparseVariable: a subclass of Variable that represents a scipy.sparse.{csc,csr}_matrix object.
- GpuArrayVariable: a subclass of Variable that represents our object on the GPU that is a subset of numpy.ndarray.
- RandomVariable.
A Variable which is the output of a symbolic computation will have an owner not equal to None.
Using the Variables' owner field and the Apply nodes' inputs fields, one can navigate a graph from an output all the way to the inputs. The opposite direction is possible with a FunctionGraph and its FunctionGraph.clients dict, which maps Variables to a list of their clients.
Parameters:
- type (a Type instance) – The type governs the kind of data that can be associated with this variable.
- owner (None or Apply instance) – The Apply instance which computes the value for this variable.
- index (None or int) – The position of this Variable in owner.outputs.
- name (None or str) – A string for pretty-printing and debugging.
Examples

import theano
import theano.tensor as tt

a = tt.constant(1.5)           # declare a symbolic constant
b = tt.fscalar()               # declare a symbolic floating-point scalar
c = a + b                      # create a simple expression
f = theano.function([b], [c])  # this works because a has a value associated with it already
assert 4.0 == f(2.5)           # bind 2.5 to an internal copy of b and evaluate an internal c
theano.function([a], [c])      # compilation error because b (required by c) is undefined
theano.function([a, b], [c])   # compilation error because a is constant; it can't be an input

The Python variables a, b, c all refer to instances of type Variable. The Variable referred to by a is also an instance of Constant.
clone()[source]¶
Return a new Variable like self.
Returns: A new Variable instance (or subclass instance) with no owner or index.
Return type: Variable instance
Notes
Tags are copied to the returned instance.
Name is copied to the returned instance.

eval(inputs_to_values=None)[source]¶
Evaluates this variable.
Parameters: inputs_to_values – A dictionary mapping Theano Variables to values.
Examples
>>> import numpy as np
>>> import theano.tensor as tt
>>> x = tt.dscalar('x')
>>> y = tt.dscalar('y')
>>> z = x + y
>>> np.allclose(z.eval({x: 16.3, y: 12.1}), 28.4)
True
We passed eval() a dictionary mapping symbolic Theano variables to the values to substitute for them, and it returned the numerical value of the expression.
Notes
eval will be slow the first time you call it on a variable – it needs to call function() to compile the expression behind the scenes. Subsequent calls to eval() on that same variable will be fast, because the variable caches the compiled function. This way of computing has more overhead than a normal Theano function, so don't use it too much in real scripts.

get_parents()[source]¶
Return a list of the parents of this node. Should return a copy; i.e., modifying the return value should not modify the graph structure.

get_test_value()[source]¶
Get the test value.
Raises: TestValueError –

theano.graph.basic.ancestors(graphs: Iterable[theano.graph.basic.Variable], blockers: Optional[Collection[theano.graph.basic.Variable]] = None) → Generator[theano.graph.basic.Variable, None, None][source]¶
Return the variables that contribute to those in the given graphs (inclusive).
Parameters:
- graphs (list of Variable instances) – Output Variable instances from which to search backward through owners.
- blockers (list of Variable instances) – A collection of Variables that, when found, prevent the graph search from proceeding from that point.
Yields: Variables – All input nodes, in the order found by a left-recursive depth-first search started at the nodes in graphs.
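This traversal can be sketched in pure Python. The Var/App classes and toy_ancestors function below are illustrative stand-ins (not Theano's types), showing the inclusive owner-to-inputs search and how a blocker stops expansion at that point:

```python
# Toy analogue of ancestors(): yield every variable reachable from
# the given outputs through owner/inputs links, depth-first, without
# expanding past any blocker.

class Var:
    def __init__(self, name, owner=None):
        self.name, self.owner = name, owner

class App:
    def __init__(self, inputs):
        self.inputs = inputs

def toy_ancestors(graphs, blockers=None):
    blockers = set(blockers or ())
    seen = set()
    stack = list(reversed(graphs))
    while stack:
        v = stack.pop()
        if v in seen:
            continue
        seen.add(v)
        yield v                       # inclusive: the outputs themselves too
        if v not in blockers and v.owner is not None:
            stack.extend(reversed(v.owner.inputs))

x, y = Var("x"), Var("y")
s = Var("s", owner=App([x, y]))       # s = f(x, y)
t = Var("t", owner=App([s, y]))       # t = g(s, y)
assert {v.name for v in toy_ancestors([t])} == {"t", "s", "x", "y"}
# blocking at s prevents the search from reaching x through it
assert [v.name for v in toy_ancestors([t], blockers=[s])] == ["t", "s", "y"]
```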

theano.graph.basic.applys_between(ins: Collection[theano.graph.basic.Variable], outs: Iterable[theano.graph.basic.Variable]) → Generator[theano.graph.basic.Apply, None, None][source]¶
Extract the Applys contained within the subgraph between the given input and output variables.
Parameters:
- ins (Collection[Variable]) – Input Variables.
- outs (Iterable[Variable]) – Output Variables.
Yields: Applys – The Applys that are contained within the subgraph that lies between ins and outs, including the owners of the Variables in outs and intermediary Applys between ins and outs, but not the owners of the Variables in ins.

theano.graph.basic.as_string(inputs: List[theano.graph.basic.Variable], outputs: List[theano.graph.basic.Variable], leaf_formatter=<class 'str'>, node_formatter=<function default_node_formatter>) → List[str][source]¶
Returns a string representation of the subgraph between inputs and outputs.
Parameters:
- inputs (list) – Input Variables.
- outputs (list) – Output Variables.
- leaf_formatter (callable) – Takes a Variable and returns a string to describe it.
- node_formatter (callable) – Takes an Op and the list of strings corresponding to its arguments and returns a string to describe it.
Returns: A string representation of the subgraph between inputs and outputs. If the same node is used by several other nodes, the first occurrence will be marked as *n -> description and all subsequent occurrences will be marked as *n, where n is an id number (ids are attributed in an unspecified order and only exist for viewing convenience).
Return type: list of str

theano.graph.basic.clone(inputs: Collection[theano.graph.basic.Variable], outputs: Collection[theano.graph.basic.Variable], copy_inputs: bool = True, copy_orphans: Optional[bool] = None) → Tuple[Collection[theano.graph.basic.Variable], Collection[theano.graph.basic.Variable]][source]¶
Copies the subgraph contained between inputs and outputs.
Parameters:
- inputs (list) – Input Variables.
- outputs (list) – Output Variables.
- copy_inputs (bool) – If True, the inputs will be copied (defaults to True).
- copy_orphans – When None, use the copy_inputs value. When True, new orphan nodes are created. When False, the original orphan nodes are reused in the new graph.
Returns: The inputs and outputs of that copy.
Return type: object
Notes
A constant that is in the inputs list is not an orphan, so it will be copied according to the copy_inputs parameter. Otherwise it will be copied according to the copy_orphans parameter.

theano.graph.basic.clone_get_equiv(inputs: Collection[theano.graph.basic.Variable], outputs: Collection[theano.graph.basic.Variable], copy_inputs: bool = True, copy_orphans: bool = True, memo: Optional[Dict[theano.graph.basic.Variable, theano.graph.basic.Variable]] = None)[source]¶
Return a dictionary that maps from Variable and Apply nodes in the original graph to a new node (a clone) in a new graph.
This function works by recursively cloning inputs, rebuilding a directed graph from the inputs up to eventually building new outputs.
Parameters:
- inputs (a list of Variables) –
- outputs (a list of Variables) –
- copy_inputs (bool) – True means to create the cloned graph from new input nodes (the bottom of a feed-upward graph). False means to clone a graph that is rooted at the original input nodes.
- copy_orphans – When True, new constant nodes are created. When False, the original constant nodes are reused in the new graph.
- memo (None or dict) – Optionally start with a partly-filled dictionary for the return value. If a dictionary is passed, this function will work in-place on that dictionary and return it.

theano.graph.basic.clone_replace(output: Collection[theano.graph.basic.Variable], replace: Optional[Dict[theano.graph.basic.Variable, theano.graph.basic.Variable]] = None, strict: bool = True, share_inputs: bool = True) → Collection[theano.graph.basic.Variable][source]¶
Clone a graph and replace subgraphs within it.
It returns a copy of the initial subgraph with the corresponding substitutions.
Parameters:
- output (Theano Variables (or Theano expressions)) – Theano expression that represents the computational graph.
- replace (dict) – Dictionary describing which subgraphs should be replaced by what.
- share_inputs (bool) – If True, use the same inputs (and shared variables) as the original graph. If False, clone them. Note that cloned shared variables still use the same underlying storage, so they will always have the same value.

theano.graph.basic.equal_computations(xs, ys, in_xs=None, in_ys=None)[source]¶
Checks if Theano graphs represent the same computations.
The two lists xs and ys should have the same number of entries. The function checks if, for each corresponding pair (x, y) from zip(xs, ys), x and y represent the same computations on the same variables (unless equivalences are provided using in_xs, in_ys).
If in_xs and in_ys are provided, then when comparing a node x with a node y, they are automatically considered equal if there is some index i such that x == in_xs[i] and y == in_ys[i] (and they both have the same type). Note that x and y can be in the lists xs and ys, but they can also represent subgraphs of a computational graph in xs or ys.
Parameters:
- xs (list of Variable) –
- ys (list of Variable) –
Returns:
Return type: bool
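The flavour of this structural comparison can be sketched with nested tuples standing in for Theano graphs. toy_equal below is hypothetical and deliberately ignores the in_xs/in_ys equivalence mechanism; leaves compare by identity, interior nodes by op and pairwise-equal inputs:

```python
# Toy structural-equality check: two expression trees are "the same
# computation" if their ops match and their inputs are pairwise equal.

def toy_equal(x, y):
    x_is_leaf = not isinstance(x, tuple)
    y_is_leaf = not isinstance(y, tuple)
    if x_is_leaf or y_is_leaf:
        return x is y                       # leaves: same variable object
    op_x, *ins_x = x
    op_y, *ins_y = y
    if op_x != op_y or len(ins_x) != len(ins_y):
        return False
    return all(toy_equal(a, b) for a, b in zip(ins_x, ins_y))

x, y = object(), object()                   # stand-ins for input variables
g1 = ("add", ("mul", x, y), x)
g2 = ("add", ("mul", x, y), x)
g3 = ("add", ("mul", y, x), x)              # operands swapped
assert toy_equal(g1, g2)
assert not toy_equal(g1, g3)
```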

theano.graph.basic.general_toposort(outputs: Iterable[T], deps: Callable[[T], Union[theano.misc.ordered_set.OrderedSet, List[T]]], compute_deps_cache: Optional[Callable[[T], Union[theano.misc.ordered_set.OrderedSet, List[T]]]] = None, deps_cache: Optional[Dict[T, List[T]]] = None, clients: Optional[Dict[T, List[T]]] = None) → List[T][source]¶
Perform a topological sort of all nodes starting from a given node.
Parameters:
- deps (callable) – A Python function that takes a node as input and returns its dependencies.
- compute_deps_cache (optional) – If provided, deps_cache should also be provided. This is a function like deps, but one that also caches its results in a dict passed as deps_cache.
- deps_cache (dict) – A dict mapping nodes to their children. This is populated by compute_deps_cache.
- clients (dict) – If a dict is passed, it will be filled with a mapping of nodes-to-clients for each node in the subgraph.
Notes
deps(i) should behave like a pure function (no funny business with internal state).
deps(i) will be cached by this function (to be fast).
The order of the return value list is determined by the order of nodes returned by the deps() function.
Either deps should be provided, or it can be None and the caller provides compute_deps_cache and deps_cache. The second option removes a Python function call and allows for more specialized code, so it can be faster.
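The core of the sort can be sketched with a deps callable over plain Python objects. toy_toposort below is an illustrative recursive version under the stated pure-deps assumption; it omits the caching and clients machinery of the real function:

```python
# Minimal topological sort in the spirit of general_toposort: each
# node is emitted only after everything returned by deps(node).

def toy_toposort(outputs, deps):
    order, done = [], set()

    def visit(node):
        if node in done:
            return
        done.add(node)
        for d in deps(node):   # deps is assumed pure; not cached here
            visit(d)
        order.append(node)     # emitted after all of its dependencies

    for out in outputs:
        visit(out)
    return order

# "c" depends on "a" and "b"; "b" depends on "a"
graph = {"c": ["a", "b"], "b": ["a"], "a": []}
order = toy_toposort(["c"], deps=lambda n: graph[n])
assert order.index("a") < order.index("b") < order.index("c")
```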

theano.graph.basic.graph_inputs(graphs: Iterable[theano.graph.basic.Variable], blockers: Optional[Collection[theano.graph.basic.Variable]] = None) → Generator[theano.graph.basic.Variable, None, None][source]¶
Return the inputs required to compute the given Variables.
Parameters:
- graphs (list of Variable instances) – Output Variable instances from which to search backward through owners.
- blockers (list of Variable instances) – A collection of Variables that, when found, prevent the graph search from proceeding from that point.
Yields: Variables – Input nodes with no owner, in the order found by a left-recursive depth-first search started at the nodes in graphs.

theano.graph.basic.io_connection_pattern(inputs, outputs)[source]¶
Returns the connection pattern of a subgraph defined by the given inputs and outputs.

theano.graph.basic.io_toposort(inputs: List[theano.graph.basic.Variable], outputs: List[theano.graph.basic.Variable], orderings: Optional[Dict[theano.graph.basic.Apply, List[theano.graph.basic.Apply]]] = None, clients: Optional[Dict[theano.graph.basic.Variable, List[theano.graph.basic.Variable]]] = None) → List[theano.graph.basic.Apply][source]¶
Perform a topological sort from input and output nodes.
Parameters:
- inputs (list or tuple of Variable instances) – Graph inputs.
- outputs (list or tuple of Apply instances) – Graph outputs.
- orderings (dict) – Keys are Apply instances, values are lists of Apply instances.
- clients (dict) – If provided, it will be filled with mappings of nodes-to-clients for each node in the subgraph that is sorted.

theano.graph.basic.is_in_ancestors(l_apply: theano.graph.basic.Apply, f_node: theano.graph.basic.Apply) → bool[source]¶
Determine if f_node is in the graph given by l_apply.
Parameters:
- l_apply (Apply) –
- f_node (Apply) –
Returns:
Return type: bool

theano.graph.basic.list_of_nodes(inputs: Collection[theano.graph.basic.Variable], outputs: Iterable[theano.graph.basic.Variable]) → List[theano.graph.basic.Apply][source]¶
Return the Apply nodes of the graph between inputs and outputs.
Parameters:
- inputs (Collection[Variable]) –
- outputs (Iterable[Variable]) –

theano.graph.basic.nodes_constructed()[source]¶
A context manager used in inherit_stack_trace that keeps track of all the variable nodes newly created inside an optimization. A new_nodes list is instantiated, but it is filled lazily (whenever Variable.notify_construction_observers is called).
An observer is an entity that updates the new_nodes list. construction_observers is a list inside the Variable class that contains the observer functions. The observer functions in construction_observers are called only when a variable node is instantiated (i.e., when Variable.notify_construction_observers is called). When an observer function is called, a new variable node is added to the new_nodes list.
Parameters:
- new_nodes – A list of all the variable nodes that are created inside the optimization.
- yields – the new_nodes list.

theano.graph.basic.op_as_string(i, op, leaf_formatter=<class 'str'>, node_formatter=<function default_node_formatter>)[source]¶
Return a string representation of the subgraph between i and the outputs of op.

theano.graph.basic.orphans_between(ins: Collection[theano.graph.basic.Variable], outs: Iterable[theano.graph.basic.Variable]) → Generator[theano.graph.basic.Variable, None, None][source]¶
Extract the Variables not within the subgraph between input and output nodes.
Parameters:
- ins (Collection[Variable]) – Input Variables.
- outs (Iterable[Variable]) – Output Variables.
Yields: Variables – The Variables upon which one or more Variables in outs depend, but which are neither in ins nor in the subgraph that lies between them.
Examples
>>> orphans_between([x], [(x+y).out])
[y]

theano.graph.basic.vars_between(ins: Collection[theano.graph.basic.Variable], outs: Iterable[theano.graph.basic.Variable]) → Generator[theano.graph.basic.Variable, None, None][source]¶
Extract the Variables within the subgraph between input and output nodes.
Parameters:
- ins (Collection[Variable]) – Input Variables.
- outs (Iterable[Variable]) – Output Variables.
Yields: Variables – The Variables that are involved in the subgraph that lies between ins and outs. This includes ins, outs, orphans_between(ins, outs), and all values of all intermediary steps from ins to outs.

theano.graph.basic.view_roots(node: theano.graph.basic.Variable) → List[theano.graph.basic.Variable][source]¶
Return the leaves from a search through consecutive view-maps.

theano.graph.basic.walk(nodes: Iterable[T], expand: Callable[[T], Optional[Sequence[T]]], bfs: bool = True, return_children: bool = False, hash_fn: Callable[[T], Hashable] = <builtin function id>) → Generator[T, None, Dict[T, List[T]]][source]¶
Walk through a graph, either breadth- or depth-first.
Parameters:
- nodes (deque) – The nodes from which to start walking.
- expand (callable) – A callable that is applied to each node in nodes; its results are either new nodes to visit or None.
- bfs (bool) – If True, breadth-first search is used; otherwise, depth-first search.
- return_children (bool) – If True, each output node will be accompanied by the output of expand (i.e. the corresponding child nodes).
- hash_fn (callable) – The function used to produce hashes of the elements in nodes. The default is id.
Yields: nodes – When return_children is True, an inverse map is returned as well.
Notes
A node will appear at most once in the return value, even if it appears multiple times in the nodes parameter.
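The walk described above can be sketched in pure Python. toy_walk below is illustrative only: it keeps the bfs flag and hash_fn-based deduplication but omits return_children and the final inverse map:

```python
# Toy analogue of walk(): visit nodes breadth- or depth-first,
# yielding each node at most once (deduplicated by hash_fn).

from collections import deque

def toy_walk(nodes, expand, bfs=True, hash_fn=id):
    seen = set()
    queue = deque(nodes)
    pop = queue.popleft if bfs else queue.pop   # BFS pops left, DFS pops right
    while queue:
        node = pop()
        key = hash_fn(node)
        if key in seen:
            continue                 # each node appears at most once
        seen.add(key)
        yield node
        children = expand(node)
        if children:
            queue.extend(children)

# a small tree: 1 -> (2, 3), 2 -> (4,)
tree = {1: [2, 3], 2: [4], 3: [], 4: []}
bfs_order = list(toy_walk([1], expand=lambda n: tree[n], hash_fn=lambda n: n))
assert bfs_order == [1, 2, 3, 4]
```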