u/elonkingo

I want to share my Python code for a new networking approach with all of you. Just copy the entire text and use it in whatever way is useful; it isn't limited to one use case. If you want, I can also share the simulation code, but first I want to post the Python code and see how you use it.

import numpy as np
import random

# --------------------------
# Node Definition
# --------------------------

class Node:
    def __init__(self, id):
        self.id = id
        self.rs = np.random.rand(4)  # Resonance Signature

    def similarity(self, other_rs):
        # Cosine similarity between resonance signatures
        return np.dot(self.rs, other_rs) / (
            np.linalg.norm(self.rs) * np.linalg.norm(other_rs)
        )

    def receive(self, signal):
        print(f"Node {self.id} received: {signal['data']}")
        return self.mutate(signal)

    def mutate(self, signal):
        # Slight mutation: each hop dissipates a little energy
        signal['energy'] *= 0.9
        return signal

# --------------------------
# Encode Message
# --------------------------

def encode(message):
    return {
        "data": message,
        "energy": 1.0,
    }

# --------------------------
# Propagation Engine
# --------------------------

def propagate(sender, nodes, message, threshold=0.7):
    signal = encode(message)
    while signal["energy"] > 0.1:
        next_nodes = []
        for node in nodes:
            if node is sender:
                continue
            score = sender.similarity(node.rs)
            if score > threshold:
                signal = node.receive(signal)
                next_nodes.append(node)
        if not next_nodes:
            break
        # Hand the signal to a random resonant node and decay its energy
        sender = random.choice(next_nodes)
        signal["energy"] *= 0.8

# --------------------------
# Simulation
# --------------------------

nodes = [Node(i) for i in range(10)]
sender = nodes[0]
propagate(sender, nodes, "Hello from Resonance Network!")

u/elonkingo — 5 hours ago
My neural network is getting better (accuracy tracking) – Day 8/30 & I discovered a new networking approach

Day 8 of building a neural network from scratch in Python (no libraries), and I discovered a new networking approach.

I want to release it as open source. I've even written a report on it, which I'm posting here on Reddit because I don't have a GitHub account.

The image above shows the simulation of the network.

I'll explain this new network in another post; that was the reason this post was delayed.

Today I started tracking accuracy.

Until now, I knew the model was learning because the loss was decreasing.

But accuracy makes it clearer:

How often is the model actually correct?

Right now, the accuracy is still low — but it’s improving with each training cycle.

Example:

Epoch 1 → Accuracy: 12%

Epoch 3 → Accuracy: 28%

Epoch 5 → Accuracy: 41%

This might not look impressive yet, but it proves something important:

The model is learning.

Each iteration makes it slightly better than before.
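In code, accuracy tracking can be as simple as this sketch (the prediction and label values here are made up for illustration):

```python
# Sketch: accuracy = fraction of predictions that match the labels.
def accuracy(predictions, labels):
    correct = sum(1 for p, y in zip(predictions, labels) if p == y)
    return correct / len(labels)

# Hypothetical example values for one batch:
predictions = [7, 2, 1, 0, 4, 1, 4, 9, 5, 9]
labels      = [7, 2, 1, 0, 4, 1, 4, 9, 6, 9]

print(f"Accuracy: {accuracy(predictions, labels):.0%}")  # 9 of 10 correct -> 90%
```

Logging this value once per epoch is what produces a table like the one above.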

Tomorrow, I’ll focus on improving performance and making training more efficient.

Day 8/30 ✅

I’ll update again tomorrow.

u/elonkingo — 7 hours ago
I connected everything into a training loop – Day 6/30

Day 6 of building a neural network from scratch in Python (no libraries).

Today I connected everything together into a full training loop.

Until now, I had:

Forward pass (prediction)

Loss function (error)

Backpropagation (learning)

Now the model does this repeatedly:

Take input

Make prediction

Calculate loss

Adjust weights

Repeat

This loop is what actually trains the model.
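The five steps above can be sketched with a one-weight toy model (all names and values here are hypothetical, just to show the loop shape):

```python
# Minimal training loop: fit y = w * x to data generated by y = 2 * x.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = 0.0      # untrained weight
lr = 0.05    # learning rate

for epoch in range(100):
    for x, target in data:
        pred = w * x                 # 1-2. take input, make prediction
        error = pred - target        # 3. calculate loss (squared error)
        w -= lr * 2 * error * x      # 4. adjust weight via the gradient
                                     # 5. repeat
print(round(w, 2))  # w converges to 2.0, the true weight
```

The real network does the same thing, just with matrices of weights instead of a single number.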

Right now, it's still early — but the system is officially learning.

Even small improvements mean the logic is working.

Tomorrow, I’ll focus on tracking performance and seeing if accuracy improves over time.

Day 6/30 ✅

I’ll update again tomorrow.

u/elonkingo — 2 days ago
How a neural network actually learns (Backpropagation) – Day 5/30

Day 5 of building a neural network from scratch in Python (no libraries), using only a mobile phone.

Until now:

The model can produce output

It can measure how wrong it is (loss)

But today is the real question:

How does it improve?

This is where backpropagation comes in.

In simple terms:

The model takes the error (loss) and sends it backward through the network.

While going backward, it adjusts:

Weights

Biases

So next time, the prediction is slightly better.

Think of it like this:

You make a mistake

You understand what went wrong

You adjust your approach

That’s exactly what the network is doing.
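For a single weight, that mistake-understand-adjust cycle looks like this (values are made up; the real network repeats this for every weight and bias):

```python
# One backpropagation step for a single weight.
w = 0.5                         # current weight
x, target = 3.0, 6.0            # one training example

pred = w * x                    # make a "mistake": pred = 1.5
loss = (pred - target) ** 2     # measure how wrong it was
grad = 2 * (pred - target) * x  # understand what went wrong: d(loss)/d(w)
w = w - 0.01 * grad             # adjust the approach

print(round(w, 2))  # 0.77: the weight moved toward the value that fits
```

The gradient points in the direction that increases the loss, so subtracting it makes the next prediction slightly better.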

Today, I started implementing this learning process.

Tomorrow, I’ll connect everything together into a full training loop.

Day 5/30 ✅

I’ll update again tomorrow.

u/elonkingo — 3 days ago
How does a neural network know it’s wrong? (Loss Function) – Day 4/30

Day 4 of building a neural network from scratch in Python (no libraries). I've been using only a mobile phone, not a PC, from the beginning.

Yesterday, the model produced its first output.

Today, I asked a simple question:

How does the model know if it’s wrong?

That’s where the loss function comes in.

A loss function measures the difference between:

What the model predicted

What the correct answer actually is

Example:

If the model predicts “3” but the correct answer is “7”, the loss will be high.

If it predicts correctly, the loss will be low.

So basically:

Loss = how wrong the model is

This value is what we’ll use to improve the model in the next step.
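The "3 vs 7" example above, written as a squared-error loss (one common choice of loss function):

```python
# Loss = how wrong the model is; squared error is one simple measure.
def squared_error(pred, target):
    return (pred - target) ** 2

print(squared_error(3, 7))  # model predicts 3, answer is 7 -> loss 16 (high)
print(squared_error(7, 7))  # correct prediction -> loss 0 (low)
```

Squaring keeps the loss positive and punishes big mistakes more than small ones.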

Tomorrow, I’ll start working on how the model learns from this error (backpropagation).

Day 4/30 ✅

I’ll update again tomorrow.

u/elonkingo — 4 days ago
My neural network produced its first output (forward pass) – Day 3/30

Day 3 of building a neural network from scratch in Python (no libraries).

Today I implemented the forward pass — the part where the network actually produces an output.

This is the first time it feels like something real.

Right now, the output is basically random because the model hasn’t learned anything yet.

But the important part is:

The data is flowing through the network correctly.

Input → Hidden layers → Output

Each step:

Multiply by weights

Add bias

Apply activation

And finally, it produces a result.
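For one neuron, those three steps look like this sketch (the input, weight, and bias values are made up for illustration):

```python
# One neuron's forward step: weighted sum + bias, then activation.
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

inputs  = [0.5, 0.1, 0.9]
weights = [0.4, 0.7, 0.2]
bias    = 0.1

z = sum(i * w for i, w in zip(inputs, weights)) + bias  # multiply + add bias
output = sigmoid(z)                                     # apply activation
print(round(output, 3))  # 0.634
```

A full layer just does this for many neurons at once, and the output of one layer becomes the input of the next.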

Even though it’s not accurate yet, this is the first real step toward a working model.

Tomorrow, I’ll work on improving this by introducing a way to measure how wrong the output is (loss function).

Day 3/30 ✅

I’ll update again tomorrow.

u/elonkingo — 5 days ago
What is a Neural Network (simple explanation with math) – Day 2/30

Day 2 of building a neural network from scratch in Python (no libraries).

Before going deeper, I wanted to understand one thing clearly:

What actually *is* a neural network?

The simplest way to think about it:

A neural network is just a system that takes input, processes it, and gives an output.

Example:

* Input → image of a digit

* Output → predicted number (like 7)

Between input and output, there are layers that:

* Multiply numbers (weights)

* Add values (bias)

* Pass through functions (activation)

That’s it.

No magic. Just math + logic.

##############################

Linear Algebra (Custom Matrix Engine)

In code:
=========================

    def dot(self, o):
        # inner accumulation of the dot product;
        # the surrounding i, j, k loops are omitted in this fragment
        s += self.data[i][k] * o.data[k][j]

==============================

Algorithm:

Matrix Multiplication (Dot Product)

Time Complexity: O(n³)

Also used:

Matrix Addition → add()

Matrix Transpose → transpose()

👉 This is the core computation engine.

This is exactly how my matrix engine works internally 🔥
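A complete, standalone version of the dot() fragment above might look like this (written over plain lists of lists; the function name and loop variables are my own):

```python
# Matrix multiplication with the triple loop: O(n^3) time complexity.
def mat_dot(a, b):
    rows, inner, cols = len(a), len(b), len(b[0])
    result = [[0] * cols for _ in range(rows)]
    for i in range(rows):           # each row of A
        for j in range(cols):       # each column of B
            s = 0
            for k in range(inner):  # the accumulation shown in the fragment
                s += a[i][k] * b[k][j]
            result[i][j] = s
    return result

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(mat_dot(A, B))  # [[19, 22], [43, 50]]
```

Matrix addition and transpose follow the same pattern with one or two loops instead of three.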

What you're seeing in the image is a visual representation of my custom matrix engine. Explanation:

Left (Matrix A) → input (like image pixels or activations)

Bottom left (Matrix B) → weights

Right (Result) → output after multiplication (A × B)

_________________________________

  1. Forward Propagation (Linear Transformation)

Algorithm:

Z = XW + b

👉 This is repeated across layers:

784 → 128 → 64 → 10
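As a sketch, here is Z = XW + b for one tiny layer (2 inputs to 3 units; in the real network the shapes are 784 → 128 → 64 → 10, and all values here are made up):

```python
# One linear layer: Z = XW + b for a single sample.
X = [[1.0, 2.0]]               # 1 x 2: one sample with 2 features
W = [[0.1, 0.2, 0.3],
     [0.4, 0.5, 0.6]]          # 2 x 3: weights
b = [0.01, 0.02, 0.03]         # 3: one bias per output unit

Z = [[sum(X[0][k] * W[k][j] for k in range(2)) + b[j]
      for j in range(3)]]      # 1 x 3 result

print([round(z, 2) for z in Z[0]])  # [0.91, 1.22, 1.53]
```

Each layer repeats this, feeding its Z (after activation) into the next layer's X.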

##############################

Today, I focused on understanding how data flows through these layers.

Tomorrow, I’ll start implementing the actual forward pass in Python; for now it’s in my notes.

Day 2/30 ✅

I’ll update again tomorrow.

u/elonkingo — 7 days ago
I’m building a neural network from scratch in Python (no libraries) – Day 1/30

Hey everyone,

I’ve started a 30-day challenge to build a neural network completely from scratch using pure Python.

No TensorFlow.

No PyTorch.

No NumPy.

The goal is simple:

I want to understand how AI actually works internally — not just use pre-built tools.

For Day 1, I’ve started working on the basic structure:

Input layer

Hidden layers

Output layer

Right now, it doesn’t “learn” anything yet — but the foundation is ready.
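That foundation might look something like this sketch: just layer sizes plus randomly initialized weights and biases, no learning yet (the 784 → 128 → 64 → 10 sizes come from the later posts; everything else is a hypothetical layout):

```python
# Day 1 skeleton: layer sizes with random weights, zero biases.
import random

layer_sizes = [784, 128, 64, 10]  # input -> hidden -> hidden -> output

weights = [[[random.uniform(-0.5, 0.5) for _ in range(n_out)]
            for _ in range(n_in)]
           for n_in, n_out in zip(layer_sizes, layer_sizes[1:])]
biases = [[0.0] * n_out for n_out in layer_sizes[1:]]

print(len(weights), len(weights[0]), len(weights[0][0]))  # 3 784 128
```

Until a forward pass and a loss are wired in, this structure just holds numbers; the next posts are what make it "learn".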

Over the next 30 days, I’ll be posting daily updates:

Backpropagation

Training loop

Real dataset (MNIST)

Final working model

If anyone has suggestions or wants to follow along, feel free to join the journey.

Let’s see where this goes.

Day 1/30 ✅

I’ll update again tomorrow.

u/elonkingo — 8 days ago