Clone Mass | Clones in CloneSet | Parameter Count | Clone Similarity | Syntax Category [Sequence Length] |
---|---|---|---|---|
8 | 2 | 2 | 0.950 | compound_stmt |
Clone Instance | Line Count | Source Line | Source File |
---|---|---|---|
1 | 8 | 193 | Bio/NeuralNetwork/BackPropagation/Layer.py |
2 | 8 | 283 | Bio/NeuralNetwork/BackPropagation/Layer.py |
Clone instance 1 (`Bio/NeuralNetwork/BackPropagation/Layer.py`, line 193):

```python
# update each node in this network
for update_node in self.nodes[1:]:
    # sum up the weighted inputs from the previous network
    sum = 0.0
    for node in previous_layer.nodes:
        sum += (previous_layer.values[node] *
                previous_layer.weights[(node, update_node)])
    self.values[update_node] = self._activation(sum)
# propagate the update to the next layer
```
Clone instance 2 (`Bio/NeuralNetwork/BackPropagation/Layer.py`, line 283):

```python
# update all of the nodes in this layer
for update_node in self.nodes:
    # sum up the contribution from all of the previous inputs
    sum = 0.0
    for node in previous_layer.nodes:
        sum += (previous_layer.values[node] *
                previous_layer.weights[(node, update_node)])
    self.values[update_node] = self._activation(sum)
```
Clone Abstraction (both instances' comments are retained; the abstracted parameters are marked `[[#variable5d6e11a0]]` and `[[#variable5d6e12c0]]`):

```python
# update all of the nodes in this layer
# update each node in this network
for update_node in [[#variable5d6e11a0]] [[#variable5d6e12c0]]:
    # sum up the contribution from all of the previous inputs
    # sum up the weighted inputs from the previous network
    sum = 0.0
    for node in previous_layer.nodes:
        sum += (previous_layer.values[node] *
                previous_layer.weights[(node, update_node)])
    self.values[update_node] = self._activation(sum)
# propagate the update to the next layer
```
Parameter Index | Clone Instance | Parameter Name | Value |
---|---|---|---|
1 | 1 | [[#5d6e11a0]] | `self` |
1 | 2 | [[#5d6e11a0]] | `self.nodes` |
2 | 1 | [[#5d6e12c0]] | `.nodes` |
2 | 2 | [[#5d6e12c0]] | `[1: ]` |
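Both clone instances implement the same feed-forward step: each node's new value is the weighted sum of the previous layer's values, passed through an activation function. A minimal runnable sketch of that step follows; the `Layer` constructor and the sigmoid `_activation` are assumptions for illustration (the report does not show `_activation`'s body), and the builtin-shadowing `sum` from the clones is renamed `total`.

```python
import math

class Layer:
    """Hypothetical stand-in for the cloned Layer class."""
    def __init__(self, nodes, values, weights):
        self.nodes = nodes      # node ids in this layer
        self.values = values    # node id -> activation value
        self.weights = weights  # (node, next_node) -> weight to the next layer

    def _activation(self, value):
        # logistic sigmoid; the actual activation function is an assumption
        return 1.0 / (1.0 + math.exp(-value))

    def update(self, previous_layer):
        # the shared clone body: weighted sum of the previous layer's
        # values (weights live on the previous layer), then activation
        for update_node in self.nodes:
            total = 0.0
            for node in previous_layer.nodes:
                total += (previous_layer.values[node] *
                          previous_layer.weights[(node, update_node)])
            self.values[update_node] = self._activation(total)

# two input nodes feeding one hidden node
inputs = Layer([0, 1], {0: 1.0, 1: 0.5}, {(0, 0): 2.0, (1, 0): -1.0})
hidden = Layer([0], {0: 0.0}, {})
hidden.update(inputs)
print(hidden.values[0])  # sigmoid(1.0*2.0 + 0.5*-1.0) = sigmoid(1.5)
```

Note that the only differences the detector abstracts away are the iterated node list (`self.nodes` vs. `self.nodes[1:]`) and the comments; the numeric computation is identical, which is why the similarity score is 0.950.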