sinabs

Network

class Network(analog_model: Optional[torch.nn.Module] = None, spiking_model: Optional[torch.nn.Module] = None, input_shape: Optional[Union[numpy.ndarray, List, Tuple]] = None, synops: bool = False)

Class representing a spiking neural network.

spiking_model

torch.nn.Module, a spiking neural network model

analog_model

torch.nn.Module, an artificial neural network model

input_shape

Tuple, size of input

synops

If True (default: False), registers hooks for counting synaptic operations during forward passes by instantiating sinabs.SNNSynOpCounter.


compare_activations(data, name_list: Optional[Union[numpy.ndarray, List, Tuple]] = None, compute_rate: bool = False, verbose: bool = False) -> Tuple[List[numpy.ndarray], List[numpy.ndarray]]

Compare activations of the analog model and the SNN for a given data sample

Parameters
  • data (np.ndarray) – Data to process

  • name_list (List[str]) – list of all layer names (str) whose activations need to be compared

  • compute_rate (bool) – If True, compute firing rates; by default, spike counts are returned

  • verbose (bool) – print debugging logs to the terminal

Returns

A tuple of lists (ann_activity, snn_activity)
  • ann_activity: output activity of the ann layers

  • snn_activity: output activity of the snn layers

Return type

tuple

get_synops(num_evs_in=None) -> pandas.core.frame.DataFrame

Please see docs for sinabs.SNNSynOpCounter.get_synops().

plot_comparison(data, name_list: Optional[Union[numpy.ndarray, List, Tuple]] = None, compute_rate=False)

Plot a scatter plot comparing the activations of the analog and spiking models.

Parameters
  • data – Data to be processed

  • name_list – ArrayLike with names of all the layers of interest to be compared

  • compute_rate – Compare firing rates instead of spike count

Returns

A tuple of lists (ann_activity, snn_activity)
  • ann_activity: output activity of the ann layers

  • snn_activity: output activity of the snn layers

Return type

tuple

reset_states()

Reset all neuron states in the submodules.