Phisualize It! — Revealing the Invisible World in Real Time

by shredermann in Circuits > Arduino

721 Views, 6 Favorites, 0 Comments


PhisualizeIt.jpg
ring.jpg
HeroShot.jpg
matrice8x8.jpg

INTRODUCTION: PHISUALIZE IT!

Don't just visualize… Phisualize it.

Phisualize It! is an experimental educational instrument designed to make invisible physical phenomena visible in real time.

One sensor platform. Ten invisible phenomena.

This Instructable focuses on the 8×8 LED matrix reference build, a compact and reproducible version of the system.

More advanced form factors — such as PhiP350, built from recycled speaker LEDs — are possible, but require additional hardware adaptation and will be covered in a separate tutorial.

What You'll Detect

Ten invisible phenomena rendered in real time:

  1. Vibrations — Mechanical energy (IMU accelerometer)
  2. Audio — Structured sound with harmonics (microphone + spectral analysis)
  3. Electromagnetic interference — Power lines, switching noise (same microphone)
  4. Magnetic fields — Earth's field + perturbations (magnetometer)
  5. Infrared proximity — IR reflection detection
  6. GPS signals — Satellite tracking and signal quality
  7. Bluetooth connectivity — Wireless signal strength
  8. Temperature — Ambient thermal conditions
  9. Humidity — Relative humidity levels
  10. Atmospheric pressure — Barometric variations

What You'll Learn

By building the 8×8 reference version, you will discover how to:

  1. Distinguish structured audio from electromagnetic interference using spectral analysis
  2. Run real-time FFT processing on a microcontroller
  3. Combine multiple sensors into a coherent scientific model
  4. Design a modular LED visualization engine adaptable to any display
  5. Apply scientific normalization without artificial amplification or visual cheating

This project prioritizes honesty over spectacle.

Key Innovation

One microphone. Two invisible worlds.

The same PDM microphone captures both:

  1. Acoustic sound (voice, music, harmonics)
  2. Electromagnetic interference (power lines, switching noise)

They are separated in real time using spectral flatness analysis — a technique from professional audio DSP.

Pure signal processing. No guesswork.

Why "Phisualize"?

Most projects visualize data — they turn measurements into graphs, charts, or numbers on a screen.

This project does something different.

Phisualize It! is a play on words, but also a statement.

The name comes from Φ (phi), the Greek letter used throughout physics.

Phi + Visualize = translating physics into perception.

To visualize is to represent data.

To phisualize is to translate physics itself into perception.

Here, light is not decoration. It is a language.

  1. Every brightness level corresponds to a measured intensity
  2. Every color reflects a physical phenomenon

Nothing is invented.

Nothing is exaggerated.

You don't just visualize the invisible world.

You Phisualize it.

Ready to see the invisible?

Let's build.

Supplies

Matrice8x8led.jpeg
Nano 33BLE sense rev2.jpeg
Jumper wires (male:female).jpg
ESP32_2432S028R.jpeg
moduleGPS.jpeg

Hardware and Philosophy

Design Principle

Most projects solve problems by adding hardware.

This one solves them by extracting more information from less hardware.

Constraint is not a limitation — it is the design method.

Core Processing

Arduino Nano 33 BLE Sense Rev2

A single microcontroller board hosting multiple integrated sensors:

  1. IMU (BMI270 accelerometer + gyroscope)
  2. PDM microphone (MP34DT06JTR)
  3. Magnetometer (BMM150)
  4. Barometric pressure sensor (LPS22HB)
  5. Infrared proximity sensor (APDS9960)
  6. Temperature / humidity sensor (HS3003)
  7. Bluetooth Low Energy radio (nRF52840 — RSSI measurement)

From this limited set, the system extracts several distinct physical phenomena using filtering, FFT, and classification — instead of adding dedicated hardware.

One board.

One CPU.

Multiple invisible worlds.

External Sensor (Minimal but Essential)

GY-GPS6MV2 GPS module (u-blox NEO-6M)

Added for a single reason:

GPS satellite signals are physically inaccessible without a receiver.

It is used not for navigation, but to visualize radio signals from satellites 20,000 km above Earth, transmitting at ~1.575 GHz (the GPS L1 band).

The visualization tracks:

  1. Satellite presence
  2. Signal acquisition
  3. Live radio activity from space

Visualization (Reference Build)

8×8 RGB LED Matrix (WS2812 / NeoPixel compatible)

The reference visualization used in this Instructable is:

  1. Low cost
  2. Widely available
  3. Easy to reproduce

The limited resolution is intentional.

It forces perceptual clarity over graphical complexity.

Brightness = measured intensity (linear mapping).

No artificial amplification.

No decorative scaling.

Monitoring and Transparency (Fully Optional)

ESP32-2432S028R touchscreen (NOT required)

This is an optional wireless monitoring interface.

The instrument functions completely without it.

When present, it displays:

  1. Raw sensor values
  2. FFT data
  3. Classification states

All sensing, processing, and visualization run entirely on the Arduino.

The ESP32 observes — it does not compute.

Prototyping, Power, and Reuse

  1. Breadboard
  2. Jumper wires
  3. Standard 5 V USB power supply
  4. Recycled enclosures, cables, and LED hardware whenever possible

No custom PCB.

No laboratory equipment.

No hidden hardware.

Cost and Constraint

  1. Total cost (reference build): ~€50–70
  2. Flash usage: ~40%
  3. RAM usage: ~28%

The remaining resources are preserved to guarantee:

  1. Deterministic timing
  2. Stable FFT processing
  3. Reproducible behavior

This project was intentionally built with very little hardware.

The challenge was not to multiply sensors,

but to push each sensor beyond its usual role through signal processing.

System Architecture

Software Architecture — Modular Scientific Engine.png
System Architecture — From Physics to Light.png
architecture_1_hardware.png
architecture_2_software.png
architecture_3_dataflow.png
architecture_4_wiring.png

This step explains how the system works as a whole.

Not component by component, but flow by flow.

The core idea is simple:

Invisible physical phenomena → signal processing → light

Everything runs locally on the microcontroller.

No cloud dependency.

No external processing.

Pure real-time perception.

Event-Driven Coordination

At the heart of Phisualize It! is a coordinated system running on the Arduino Nano 33 BLE Sense Rev2.

Multiple processes run in parallel:

  1. PDM microphone — interrupt-driven audio capture
  2. IMU sampling — timer-based (90 Hz precise timing)
  3. GPS serial parsing — asynchronous NMEA data
  4. Main loop — orchestrates FFT, classification, and visualization

Short-term buffering for FFT (128 samples) ensures stable spectral analysis.

No long-term storage.

No playback mode.

What you see is exactly what is happening — now.

Sensing Layer — Listening to the Invisible

The system listens to multiple physical domains simultaneously.

Mechanical

Vibrations are extracted using a high-pass filter (fc ≈ 1.7 Hz @ 90 Hz) that removes Earth's gravity and isolates dynamic acceleration.
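The gravity removal above can be sketched as a one-pole high-pass filter. This is an illustrative host-side version (the struct and function names are mine, not the project's PhiIMU API); the coefficient is derived from the stated fc ≈ 1.7 Hz and fs = 90 Hz.

```cpp
#include <cmath>

// One-pole high-pass filter: removes the DC (gravity) component from
// accelerometer samples, leaving only dynamic acceleration.
// fc ≈ 1.7 Hz at fs = 90 Hz, as described above.
struct HighPass {
    float alpha;            // filter coefficient from fc and fs
    float prevIn  = 0.0f;
    float prevOut = 0.0f;

    HighPass(float fc, float fs) {
        // alpha = RC / (RC + dt), with RC = 1 / (2*pi*fc) and dt = 1/fs
        float rc = 1.0f / (6.2831853f * fc);
        alpha = rc / (rc + 1.0f / fs);
    }

    float process(float x) {
        float y = alpha * (prevOut + x - prevIn);
        prevIn  = x;
        prevOut = y;
        return y;
    }
};

// Feed a constant 1 g input for n samples: the output decays toward zero,
// which is exactly why a static sensor shows no vibration energy.
inline float settleOnGravity(int n) {
    HighPass hp(1.7f, 90.0f);
    float y = 0.0f;
    for (int i = 0; i < n; ++i) y = hp.process(1.0f);
    return y;
}
```

A tap on the table appears as a step change, passes through the filter, then decays back to the baseline.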

Acoustic & Electromagnetic

A single PDM microphone captures pressure variations.

Spectral analysis distinguishes:

  1. structured sound (voice, music) — harmonic peaks
  2. electromagnetic interference (power lines, switching noise) — flat spectrum

Magnetic

Local perturbations of Earth's magnetic field are measured as deltas from a calibrated baseline.

Environmental

Pressure, temperature, and humidity are tracked as changes over time, not absolute values.

Radio

  1. GPS module detects satellite presence and signal acquisition
  2. BLE radio provides signal strength as a physical phenomenon

Each sensor delivers raw, unoptimized data.

No cosmetic smoothing.

No artificial amplification.

Processing Layer — Making Sense of Signals

Raw signals alone are meaningless.

This is where Phisualize It! differs from most LED projects.

Key operations include:

  1. High-pass and low-pass filtering (gravity removal, noise rejection)
  2. Fast Fourier Transform (FFT) (frequency domain analysis)
  3. Energy extraction (dominant spectral amplitude)
  4. Spectral flatness calculation (audio vs EMI classification)
  5. Baseline subtraction (magnetic, barometric, temperature deltas)
  6. Fixed-scale normalization (physical reference values)

All processing is deterministic and traceable.

Every transformation maps directly to a physical measurement.

No hidden coefficients.

No "magic values".

Only temporal low-pass filtering for perceptual stability and noise rejection.

Classification Without Guesswork

Classification is physics-based, not heuristic.

Example (microphone):

  1. Spectral flatness > 0.45 → electromagnetic interference (flat broadband spectrum)
  2. Spectral flatness < 0.40 → structured audio (harmonic peaks)
  3. Hysteresis (0.40–0.45) → prevents rapid oscillation between states

The same sensor.

Two different physical realities.

Separated mathematically, not by guessing.
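The hysteresis logic above fits in a few lines. This is a sketch of the mechanism, using the thresholds stated in the text (the type and struct names are illustrative, not the project's API):

```cpp
// Audio/EMI classifier with hysteresis:
//   SF > 0.45 -> EMI, SF < 0.40 -> AUDIO,
//   0.40-0.45 -> keep the previous state (no rapid oscillation).
enum class SignalClass { AUDIO, EMI };

struct FlatnessClassifier {
    SignalClass state = SignalClass::AUDIO;

    SignalClass update(float spectralFlatness) {
        if (spectralFlatness > 0.45f)      state = SignalClass::EMI;
        else if (spectralFlatness < 0.40f) state = SignalClass::AUDIO;
        // values inside 0.40-0.45 leave the state untouched
        return state;
    }
};
```

A flatness value of 0.42 keeps whatever state was last decided, which is the whole point of the hysteresis band.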

Visualization Layer — Light as a Language

Once normalized, physical intensities are sent to the visualization engine.

A strict rule applies:

Brightness always represents measured intensity.

  1. Dim light → weak phenomenon
  2. Bright light → strong phenomenon
  3. Flicker → change in the physical world

No dynamic auto-scaling.

Each phenomenon has a fixed physical reference:

  1. Vibrations: 0.10 g
  2. Audio: 0.02 amplitude
  3. EMI: 0.005 amplitude
  4. Magnetic field: 50 µT
  5. Proximity: 1.0 (full scale)
  6. Environmental deltas: 5 hPa / 10°C / 30% RH

Brightness reflects true measured intensity against this reference.

The LED matrix becomes a perceptual surface, not a display.

Display-Agnostic by Design

The visualization engine does not care what LEDs are connected.

The same code can drive:

  1. An 8×8 LED matrix (reference build)
  2. A circular LED ring (I built one with 64 LEDs)
  3. Recycled LED bars from consumer devices (I used LEDs from a broken earz-P350)

Only the physical mapping changes.

The scientific model remains identical.

Optional Monitoring — Seeing the Numbers

An optional ESP32 touchscreen can connect via BLE.

Its role is strictly observational:

  1. Raw values
  2. FFT magnitudes
  3. Classification states
  4. Internal variables

Think of it as an oscilloscope for perception.

Remove it, and nothing breaks.

Why This Architecture Matters

Most visualization projects start with visuals and add meaning later.

This one does the opposite:

Physics first.

Processing second.

Light last.

You are not watching animations.

You are watching the invisible world express itself — unfiltered, unoptimized, authentic.

Next Step

We'll explore the science behind each phenomenon:

what it is, how it's measured, and why the math works.

The Science: Turning Signals Into Meaning

IMU SIGNAL PROCESSING.jpg
diagram_2_audio_emi_classification.png
diagram_3_spectrum_8bands.png
diagram_4_vibro_acoustic_coupling.png
diagram_5_brightness_reality.png
diagram_6_all_phenomena_overview-2.png

This step explains why the math matters — and why the visuals are trustworthy.

From Time to Frequency — Why FFT Matters

Most physical sensors deliver time-domain signals:

  1. acceleration over time
  2. pressure variations over time
  3. microphone amplitude over time

But many invisible phenomena are not defined by when they happen —

they are defined by how their energy is distributed across frequencies.

That is why Phisualize It! relies on the Fast Fourier Transform (FFT).

FFT Processing Pipeline

TIME DOMAIN (raw signal)
x[n] = sensor samples over time (IMU or PDM microphone)
        |
        |  Windowing (Hann)
        v
xw[n] = x[n] × Hann[n]        (reduces spectral leakage)
        |
        |  FFT (N = 128 samples)
        v
X[k] = FFT(xw[n])             (complex frequency bins)
        |
        |  Magnitude
        v
|X[k]| = √(Re² + Im²)         (frequency amplitude spectrum)
        |
        +-------------+--------------+
        |                            |
        v                            v
AMPLITUDE EXTRACTION        SPECTRAL FLATNESS
- dominant peak             - geometric mean / arithmetic mean
- band energy               - structure vs noise
        |                            |
        +-------------+--------------+
                      |
                      v
       NORMALIZATION (fixed physical scale)
       No adaptive gain. No visual cheating.
                      |
                      v
             VISUAL TRANSLATION
        Brightness = measured intensity
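The pipeline above can be sketched end to end on a host machine. For clarity this uses a naive O(N²) DFT rather than the arduinoFFT library the firmware uses; the windowing, magnitude, and peak-extraction steps are the same idea. Function names are mine, not the project's.

```cpp
#include <cmath>
#include <vector>

// Hann window -> DFT -> magnitude -> dominant bin, for N = 128 samples.
constexpr int   kN  = 128;
constexpr float kPi = 3.14159265f;

std::vector<float> magnitudeSpectrum(const std::vector<float>& x) {
    std::vector<float> mag(kN / 2);
    for (int k = 0; k < kN / 2; ++k) {
        float re = 0.0f, im = 0.0f;
        for (int n = 0; n < kN; ++n) {
            // Hann window reduces spectral leakage
            float w  = 0.5f * (1.0f - std::cos(2.0f * kPi * n / (kN - 1)));
            float xw = x[n] * w;
            re += xw * std::cos(2.0f * kPi * k * n / kN);
            im -= xw * std::sin(2.0f * kPi * k * n / kN);
        }
        mag[k] = std::sqrt(re * re + im * im);  // |X[k]| = sqrt(Re² + Im²)
    }
    return mag;
}

int dominantBin(const std::vector<float>& mag) {
    int best = 1;                               // skip the DC bin
    for (int k = 2; k < (int)mag.size(); ++k)
        if (mag[k] > mag[best]) best = k;
    return best;
}

// Helper: dominant bin of a pure sine placed exactly at frequency bin f
int dominantBinOfSine(int f) {
    std::vector<float> x(kN);
    for (int n = 0; n < kN; ++n)
        x[n] = std::sin(2.0f * kPi * f * n / kN);
    return dominantBin(magnitudeSpectrum(x));
}
```

At 16 kHz and N = 128, bin k corresponds to k × 125 Hz, so a whistle near 1.25 kHz lands in bin 10.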

Why 128 Samples? Why These Rates?

FFT parameters are chosen for physical relevance, not aesthetics.

Audio / EMI

  1. Sample rate: 16 kHz
  2. FFT size: 128 samples
  3. Frequency resolution: 125 Hz per bin

Sufficient to:

  1. Detect harmonic structure (voice, music)
  2. Detect broadband noise (EMI)
  3. Separate the two reliably in real time

Vibrations (IMU)

  1. Sample rate: 90 Hz (precise timer-based)
  2. FFT size: 128 samples
  3. Frequency range: 0–45 Hz

Captures:

  1. Footsteps
  2. Table vibrations
  3. Structural resonances
  4. Environmental micro-movements

FFT Configuration Summary

┌────────┬────────┬─────┬────────────┬─────────┐
│ Signal │ Rate   │ N   │ Resolution │ Range   │
├────────┼────────┼─────┼────────────┼─────────┤
│ Audio  │ 16 kHz │ 128 │ 125 Hz/bin │ 0–8 kHz │
│ IMU    │ 90 Hz  │ 128 │ 0.7 Hz/bin │ 0–45 Hz │
└────────┴────────┴─────┴────────────┴─────────┘
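The resolution column in the table is simply the sample rate divided by the FFT size:

```cpp
// Frequency resolution of one FFT bin: sampleRate / N.
// 16 kHz / 128 = 125 Hz per bin; 90 Hz / 128 ≈ 0.7 Hz per bin.
constexpr float binResolution(float sampleRateHz, int n) {
    return sampleRateHz / n;
}
```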

One Microphone, Two Physical Worlds

The microphone is the most unconventional element of the system.

It captures:

  1. Acoustic pressure waves (sound)
  2. Electromagnetic interference (induced noise)

These two phenomena are invisible, simultaneous, and mixed in the same signal.

Audio vs EMI — Spectral Classification

PDM MICROPHONE (single sensor)
captures pressure + induced noise
        |
        v
Time-domain signal x[n]
(mixed audio + EMI)
        |
        |  FFT (128 samples @ 16 kHz)
        v
Frequency spectrum |X[k]|
        |
        |  Spectral Flatness (SF)
        v
Is the spectrum structured or flat?
        |
        +-----------+-----------+
        |                       |
   SF < 0.40               SF > 0.45
   (harmonic)              (broadband)
        |                       |
        v                       v
      AUDIO                    EMI
  Voice / music          Power supplies
  Tonal structure        Switching noise

The distinction between sound and electromagnetic interference is spectral, not volumetric.

Physics-based thresholds. No arbitrary guessing.

Spectral Flatness — Physics, Not Guesswork

Spectral flatness measures how structured a spectrum is.

  1. Flat spectrum → energy evenly distributed → EMI
  2. Peaked spectrum → harmonic structure → Audio

Mathematically:

SF = geometric_mean(|X[k]|) / arithmetic_mean(|X[k]|)

SF → 0 : harmonic, tonal → audio
SF → 1 : broadband, unstructured → EMI

Example in Practice:

  1. Speaking voice at 200 Hz fundamental → strong harmonics at 200, 400, 600 Hz → SF ≈ 0.25 → AUDIO
  2. Switched-mode power supply → broadband noise 20 Hz–8 kHz → SF ≈ 0.65 → EMI

Same sensor.

Different physics.

Separated mathematically.
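The flatness formula above is a few lines of code. A host-side sketch (computed in the log domain for numerical stability; the function name is mine, not the firmware's):

```cpp
#include <cmath>
#include <vector>

// Spectral flatness = geometric mean / arithmetic mean of the
// magnitude spectrum. SF -> 1 for flat (EMI-like) spectra,
// SF -> 0 for peaked (tonal, audio-like) spectra.
float spectralFlatness(const std::vector<float>& mag) {
    const float eps = 1e-12f;                 // avoid log(0)
    float logSum = 0.0f, sum = 0.0f;
    for (float m : mag) {
        logSum += std::log(m + eps);          // log domain for the geo mean
        sum    += m;
    }
    float geoMean   = std::exp(logSum / mag.size());
    float arithMean = sum / mag.size();
    return geoMean / arithMean;
}

// Helper: SF of a spectrum with one dominant peak among near-zero bins
float peakedFlatness() {
    std::vector<float> mag(64, 0.001f);
    mag[10] = 1.0f;                           // single strong harmonic
    return spectralFlatness(mag);
}
```

A perfectly flat spectrum gives SF ≈ 1; a single dominant peak pulls the geometric mean down and SF collapses toward 0.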

Spectral Amplitude, Not Raw Signal

The system does not visualize raw sensor readings.

It visualizes spectral amplitude:

  1. Dominant spectral amplitude for vibrations (peak FFT magnitude)
  2. Dominant spectral amplitude for audio and EMI (peak FFT magnitude)
  3. Deltas from baseline for magnetic and environmental data

This represents the strongest frequency component,

providing a stable measure of phenomenon intensity.

This is why the visualization feels stable, physical, and grounded.

Fixed-Scale Normalization (Critical Choice)

Most visualizations adapt to recent maxima.

This one does not.

Brightness = Reality

PHYSICAL INTENSITY
(measured, normalized)
        |
        v
Normalized Energy (0.0–1.0)
Fixed physical scale
No auto-gain
        |
        |  Linear mapping
        v
LED Brightness (30–255)

Dim     → weak phenomenon
Bright  → strong phenomenon
Flicker → change

Each phenomenon maps to a fixed physical reference:

  1. Vibrations: 0.10 g
  2. Audio amplitude: 0.02
  3. EMI amplitude: 0.005
  4. Magnetic field delta: 50 µT
  5. Environmental deltas: 5 hPa / 10°C / 30% RH

This ensures:

  1. Comparability over time
  2. No artificial exaggeration
  3. Honest brightness

If nothing happens, the LEDs stay dim.
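The fixed-scale mapping above is deliberately boring code. A sketch, using the reference values listed in the text (the function name is mine, not the firmware's):

```cpp
#include <algorithm>
#include <cstdint>

// Fixed-scale normalization: divide the measurement by a fixed physical
// reference (e.g. 0.10 g for vibrations), clamp to [0, 1], then map
// linearly to LED brightness 30-255. No auto-gain, no adaptive scaling.
uint8_t toBrightness(float measured, float reference) {
    float norm = std::min(std::max(measured / reference, 0.0f), 1.0f);
    return static_cast<uint8_t>(30 + norm * (255 - 30));
}
```

A 0.05 g vibration against the 0.10 g reference always lands at the same brightness, today and next month. That is what makes the display comparable over time.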

Why LEDs Never Go Completely Dark

  1. Thermal noise exists
  2. Quantization noise exists
  3. Ambient fields exist

A minimum visibility baseline (2% brightness) ensures these real background phenomena remain visible.

Zero brightness would be a lie.

That is not a bug.

That is reality.

Why This Matters

This step defines the difference between:

  1. Pretty lights reacting to sensors
  2. A perceptual scientific instrument

Every pixel corresponds to:

  1. A physical measurement
  2. A mathematical transformation
  3. A documented scale

Nothing is invented.

Nothing is hidden.

Next Step

We move from theory to practice:

building the hardware, wiring the system, flashing the code, and calibrating the instrument.

Build and Program

Nano33BLEsenseRev2.png
PhisualizeItSysteme.jpg
architecture_4_wiring.png
SerialMonitor.jpg

This is where Phisualize It! stops being an idea and becomes a real instrument.

Just a small microcontroller,

a handful of wires,

and code doing the heavy lifting.

4.1 What You're Actually Building

The reference build in this Instructable is intentionally simple:

  1. Arduino Nano 33 BLE Sense Rev2
  2. One 8×8 RGB LED matrix (WS2812 / NeoPixel compatible)
  3. One GPS module (GY-GPS6MV2)
  4. External 5V power supply

This version is:

  1. Fully functional
  2. Fully documented
  3. Easy to reproduce

It is scientifically identical to more advanced displays (PhiRing, PhiP350).

Same engine.

Different output.

4.2 Power Requirements

LED Matrix (8×8):

  1. Maximum consumption: ~2 A @ 5 V in practice at full-brightness white (the theoretical worst case for 64 WS2812 LEDs is ~3.8 A at 60 mA each)
  2. Typical usage: 0.5–1A @ 5V (normal operation)

Arduino Nano 33 BLE Sense Rev2:

  1. Consumption: ~50 mA @ 5V (USB power)

Power supply:

  1. Use quality USB power supply (≥2A rated)
  2. Do NOT power LEDs from Arduino 5V pin

4.3 Wiring — Minimal by Design

The entire system runs on three external connections.

LED Matrix

  1. Data → D4 (Arduino)
  2. VCC → External 5V supply
  3. GND → Common ground

Optional but recommended:

  1. 330Ω resistor on data line (LED protection)
  2. 1000µF capacitor on LED power supply (noise filtering)

GPS Module (Serial1 — Hardware UART)

  1. GPS TX → D0 (RX1)
  2. GPS RX → D1 (TX1)
  3. VCC → 3.3V
  4. GND → Common ground

Note: D0/D1 are Serial1 pins on the Nano 33 BLE Sense Rev2

That's it.

No level shifters.

No complicated circuits.

No magic.

⚠️ Important:

Do not power the LED matrix from the Arduino 5V pin.

Use an external 5V supply and share ground.

Frugal engineering does not mean unsafe engineering.

4.4 Physical Assembly — Frugal on Purpose

There is no enclosure STL here.

I used a recycled transparent phone case.

Why?

  1. It diffuses the LEDs naturally
  2. It protects the electronics
  3. It costs almost nothing
  4. It keeps the project accessible

The Arduino and GPS are fixed with double-sided tape.

The matrix faces outward.

Nothing fancy — and nothing hidden.

This is intentional.

4.5 Software Setup

Arduino IDE Setup

  1. Arduino IDE 2.0 or newer
  2. Board: Arduino Mbed OS Nano Boards → Arduino Nano 33 BLE
  3. Board package: Install via Boards Manager

Required Libraries (Install via Library Manager)

Sensor drivers:

  1. Arduino_BMI270_BMM150 (IMU + Magnetometer)
  2. Arduino_APDS9960 (Proximity sensor)
  3. Arduino_LPS22HB (Barometer)
  4. Arduino_HS300x (Temperature/Humidity)

Communication:

  1. ArduinoBLE (Bluetooth Low Energy)
  2. PDM (included with board package)

Output:

  1. Adafruit_NeoPixel (LED control)

Signal processing:

  1. arduinoFFT (v2.0 or newer)

All libraries are free and available through the Arduino Library Manager.

4.6 Code Architecture — Modular, Not Magical

You are not uploading a single unreadable .ino file.

The project is split into clear, independent modules, each responsible for one physical domain.

Sensing Modules

  1. PhiIMU — vibrations (accelerometer)
  2. PhiAudio — audio + EMI capture (microphone)
  3. PhiMagnetometer — magnetic field
  4. PhiProximity — IR proximity
  5. PhiBarometer — atmospheric pressure
  6. PhiHumidity — temperature & humidity
  7. PhiGPS — satellite signals

Processing Module

  1. PhiScientificModel — FFT, spectral analysis, classification

Output Modules

  1. PhiMatrixEngine — visualization logic
  2. PhiBLE — optional wireless telemetry

Each file can be read, understood, and modified on its own.

This is not accidental.

It is a teaching choice.

4.7 Upload and Verify

  1. Open the main sketch in Arduino IDE
  2. Select board: Arduino Nano 33 BLE Sense Rev2
  3. Select port: (your USB port)
  4. Upload

Verify Successful Upload

Open Serial Monitor (115200 baud).

You should see initialization messages:

✓ PhiIMU (90 Hz vibration energy sampling)
✓ PhiAudio (16 kHz, spectral Audio/EMI classification)
✓ PhiMagnetometer initialisé (BMM150)
✓ PhiProximity initialisé (APDS9960)
✓ PhiBarometer initialisé (LPS22HB, ×10 correction)
✓ PhiHumidity initialisé (HS3003 - RARE)
✓ PhiGPS (GY-GPS6MV2 + satellites + timeout)
✓ PhiBLE initialisé
✓ PhiScientificModel (Audio/EMI spectral classification)

Each module should report ✓ on successful initialization.

4.8 Calibration Process — Automatic

Before the system can see change, it must know what normal looks like.

On first boot, the system automatically calibrates:

Magnetometer baseline:

  1. Samples: ~200 (2 seconds)
  2. Measures: Earth's magnetic field at current location

Barometer baseline:

  1. Samples: ~200 (2 seconds)
  2. Measures: Current atmospheric pressure

Temperature/Humidity baseline:

  1. Samples: ~50 (5 seconds)
  2. Measures: Ambient conditions

Total calibration time: ~10 seconds

Requirements during calibration:

  1. Device must be stationary
  2. No strong magnetic sources nearby
  3. Stable environment

Serial Monitor will display: "✅ Calibration OK"

From that point on, only deltas are visualized.

This is critical:

  1. No fake movement
  2. No artificial contrast
  3. No "always exciting" visuals

If nothing changes, the LEDs stay calm.

That calm is real.

You can recalibrate at any time via Serial command.
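The calibration step is just an average over the startup samples, after which only deltas matter. A sketch of the idea (the struct and helper names are illustrative, not the project's API):

```cpp
#include <vector>
#include <numeric>

// Baseline calibration: average N startup samples, then visualize only
// the deviation from that baseline ("what normal looks like").
struct Baseline {
    float value = 0.0f;
    bool  ready = false;

    void calibrate(const std::vector<float>& samples) {
        value = std::accumulate(samples.begin(), samples.end(), 0.0f)
                / samples.size();
        ready = true;
    }

    // Signed deviation from normal: this is what drives the LEDs
    float delta(float reading) const { return reading - value; }
};

// Helper: delta of a new reading after calibrating on a constant signal
float deltaAfterCalibration(float constantSample, float reading) {
    Baseline b;
    b.calibrate(std::vector<float>(200, constantSample));   // ~200 samples
    return b.delta(reading);
}
```

This is why the device must stay still during calibration: any motion or stray field gets baked into "normal".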

4.9 Display Modes

The system has 9 display modes:

Mode 0: ALL (default)

  1. 8-column overview
  2. Each column = one phenomenon (the last column combines the three environmental readings)
  3. Simultaneous visualization

Modes 1-8: Individual phenomena (full matrix)

  1. Mode 1: IMU (spiral pattern)
  2. Mode 2: AUDIO (VU-meter spectrum)
  3. Mode 3: EMI (noise pattern)
  4. Mode 4: MAGNETIC (PHI symbol)
  5. Mode 5: PROXIMITY (IR circles)
  6. Mode 6: GPS (satellite visualization)
  7. Mode 7: BLE (signal strength)
  8. Mode 8: ENV (environmental deltas)

Switch modes:

  1. Send 0, 1, 2, 3, 4, 5, 6, 7, or 8 via Serial Monitor
  2. Selects the corresponding mode directly
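The mode-switch mechanism is a single character compared against '0'–'8'. A host-side sketch of that logic (illustrative; the firmware reads the character from Serial):

```cpp
// Serial mode switching: a received character '0'-'8' selects the mode;
// any other character leaves the current mode unchanged.
int applyModeCommand(char c, int currentMode) {
    if (c >= '0' && c <= '8') return c - '0';
    return currentMode;
}
```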

4.10 First Power-On — What You Should See

On startup:

  1. The matrix lights faintly
  2. Mode 0 (ALL) is active
  3. Every column is visible, even in silence

This is expected.

There is always:

  1. Vibration noise
  2. Electromagnetic noise
  3. Thermal drift
  4. Radio activity

Physics never sleeps.

4.11 Optional: BLE Monitor (ESP32)

An ESP32 touchscreen can connect wirelessly via Bluetooth.

Its role is strictly observational:

  1. Raw sensor values
  2. FFT magnitudes
  3. Classification states
  4. Internal variables

Setup:

  1. ESP32 scans for "Phisualize it!" BLE device
  2. Connects automatically
  3. Displays real-time telemetry

Remove it, and nothing breaks.

The Arduino runs completely independently.

4.12 Troubleshooting

Common Issues

"Library not found"

  1. → Install missing libraries via Library Manager
  2. → Verify library names exactly match (case-sensitive)

LEDs stay completely dark

  1. → Check external 5V power supply connection
  2. → Verify data wire on D4
  3. → Test with simple Adafruit NeoPixel example first

GPS shows 0 satellites continuously

  1. → Place device near window (clear sky view)
  2. → Wait 2–5 minutes (cold start acquisition time)
  3. → Check TX/RX connections (GPS TX → Arduino RX)

Compilation error

  1. → Verify board selection: "Arduino Nano 33 BLE Sense Rev2"
  2. → Update board package to latest version
  3. → Check Arduino IDE version ≥ 2.0

Serial Monitor shows gibberish

  1. → Set baud rate to 115200
  2. → Reset Arduino after opening Serial Monitor

Why This Step Matters

At this point, you haven't just "built a project".

You've assembled:

  1. A sensor fusion system
  2. A real-time DSP pipeline
  3. A perceptual display engine

Running on a €45 microcontroller.

Next Step

We stop explaining — and start seeing.


Below are all the files needed for the Nano. Place these 22 files in the same folder (Phisualizeit, for example) and the sketch will compile and upload to the Nano 33 BLE Sense Rev2.

PhisualizeItMonitor is standalone. Place it in a separate folder (PhisualizeItMonitor, for example). It is not required, but if you want monitoring, this is the sketch to use!

Visualization Modes

Audio1.jpg
Audio2.jpg
BLE1.jpg
BLE2.jpg
EMI1.jpg
EMI2.jpg
ENV1.jpg
ENV2.jpg
Env3.jpg
ENV4.jpg
GroundIssue.jpg
IMU1.jpg
IMU2.jpg
IMU3.jpg
MAG1.jpg
MAG2.jpg
MAG3.jpg
MAG4.jpg
Prox1.jpg
Prox2.jpg
Prox3.jpg
SAT1.jpg
SAT2.jpg
SAT3.jpg

SEEING THE INVISIBLE, FOR REAL

This is where Phisualize It! becomes self-explanatory.

No charts.

No numbers.

No explanations needed once you see it.

Each mode is not an animation.

It is a translation — from measured physics to visible light.

Note: Video demonstrations of each mode are included in the project gallery. Seeing the transitions in real time is essential to understanding the system.

5.1 How to Switch Modes

Send a number (0–8) via Serial Monitor (115200 baud):

  1. 0 → ALL (overview)
  2. 1 → IMU (vibrations)
  3. 2 → AUDIO (structured sound)
  4. 3 → EMI (electromagnetic interference)
  5. 4 → MAGNETIC (field variations)
  6. 5 → PROXIMITY (infrared)
  7. 6 → GPS (satellites)
  8. 7 → BLE (radio signal)
  9. 8 → ENVIRONMENT (pressure/temp/humidity)

Current mode is displayed in Serial Monitor.

5.2 Response Times

Different phenomena operate at different speeds:

┌─────────────┬───────────────────────────────┐
│ Mode        │ Response Time                 │
├─────────────┼───────────────────────────────┤
│ IMU         │ Real-time (<100 ms)           │
│ Audio       │ Real-time (<100 ms)           │
│ EMI         │ Real-time (<100 ms)           │
│ Magnetic    │ Fast (~200 ms)                │
│ Proximity   │ Fast (~200 ms)                │
│ GPS         │ Slow (2–30 seconds/satellite) │
│ BLE         │ Fast (~200 ms)                │
│ Environment │ Very slow (minutes to hours)  │
└─────────────┴───────────────────────────────┘

This is not a limitation.

This is how physics works.

5.3 Mode 0 — ALL (The Overview)


What It Shows

All detected phenomena at once, displayed as vertical columns.

Each column represents one physical domain:

Column Layout (left to right):

Column 0: IMU — Red (vibrations)

Column 1: AUDIO — Yellow (structured sound)

Column 2: EMI — Magenta (electromagnetic interference)

Column 3: MAGNETIC — Green (field variations)

Column 4: PROXIMITY — Orange (infrared)

Column 5: GPS — Blue (satellite signals)

Column 6: BLE — Cyan (radio strength)

Column 7: ENVIRONMENT — Purple (pressure/temp/humidity)

How to Read It

  1. Brightness = current physical intensity
  2. This is the mode you leave running

Even in a "quiet" room, nothing is completely dark.

There is always:

  1. Vibration noise
  2. Electromagnetic interference
  3. Ambient fields
  4. Radio activity

That faint glow is not aesthetic.

It is reality.

This single mode already proves the system is alive.

Why 8 Columns for 10 Phenomena?

The 8×8 LED matrix has physical constraints.

Environmental phenomena (pressure, temperature, humidity) are:

- Slow-changing (minutes to hours)

- Contextually related (atmospheric conditions)

- Combined into a single composite visualization

In Mode 8 (dedicated environment view):

All three are shown separately with full matrix resolution.

In Mode 0 (overview):

They merge into column 7 as an environmental "weather index".

This is a display constraint, not a sensing limitation.

The system still measures all three independently.

5.4 Mode 1 — Vibrations (IMU)


What You See

A compact spiral pattern at the center that expands outward as vibration energy increases.

The spiral fills progressively:

  1. Center only → minimal vibration (baseline noise)
  2. Partial expansion → moderate vibration
  3. Full matrix → strong impact

What It Means

You are seeing mechanical energy propagating through the environment.

Try This:

  1. Tap the table → immediate bright response
  2. Walk nearby → slow oscillations
  3. Heavy impact → full spiral expansion
  4. Set down an object → sharp transient

No cosmetic smoothing.

No artificial amplification.

Just acceleration, made visible.
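One plausible way to build the expanding spiral is to walk the 8×8 grid in spiral order from the outside in, reverse it so the innermost cells come first, and light the first round(energy × 64) pixels. This is a sketch of that construction, not the firmware's exact pattern:

```cpp
#include <vector>
#include <utility>

// Spiral order for an NxN grid, innermost cells first, so the lit region
// grows outward from the center as vibration energy increases.
std::vector<std::pair<int,int>> centerOutSpiral(int size) {
    std::vector<std::pair<int,int>> order;
    int top = 0, bottom = size - 1, left = 0, right = size - 1;
    while (top <= bottom && left <= right) {
        for (int c = left; c <= right; ++c)      order.push_back({top, c});
        for (int r = top + 1; r <= bottom; ++r)  order.push_back({r, right});
        if (top < bottom)
            for (int c = right - 1; c >= left; --c) order.push_back({bottom, c});
        if (left < right)
            for (int r = bottom - 1; r > top; --r)  order.push_back({r, left});
        ++top; --bottom; ++left; --right;
    }
    // Reverse: innermost cells first, so the pattern expands from center
    return {order.rbegin(), order.rend()};
}

// Linear mapping from normalized energy (0-1) to lit pixel count. No auto-gain.
int litPixels(float energy) {
    if (energy < 0.0f) energy = 0.0f;
    if (energy > 1.0f) energy = 1.0f;
    return static_cast<int>(energy * 64.0f + 0.5f);
}
```

Half the reference vibration energy lights exactly half the spiral, because the mapping is linear and fixed.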

5.5 Modes 2 and 3 — Audio vs EMI (Two Sides of One Sensor)


These two modes use the same microphone.

They are separated mathematically using spectral flatness analysis.

Mode 2: AUDIO (Structured Sound)

What you see:

  1. 8-column VU-meter (vertical bars)
  2. Each column = one frequency band (125 Hz wide)
  3. Colors correspond to harmonic bands

Structured sound:

  1. Voice
  2. Music
  3. Tonal sources

You see harmonic structure.

Energy concentrates in specific frequency regions.

Try This:

  1. Speak or whistle → distinct harmonic peaks
  2. Play music → rhythmic patterns
  3. Hum at different pitches → energy shifts between columns

Mode 3: EMI (Electromagnetic Interference)

What you see:

  1. Scattered noise pattern (random pixel distribution)
  2. Colors reflect spectral energy distribution
  3. Chaotic, unstructured

Unstructured interference:

  1. Power supplies
  2. Mains hum (50/60 Hz)
  3. Switching noise

You see distributed, chaotic energy.

Try This:

  1. Bring a phone charger near the device
  2. Turn on/off fluorescent lights
  3. Use a laptop power supply nearby
  4. Wave your phone nearby while it records video (its switching power regulator emits EMI)

The system switches automatically using spectral flatness.

This is physics.

One sensor.

Two invisible realities.

5.6 Mode 4 — Magnetic Field


What You See

A PHI symbol (circle + vertical bar) that expands progressively as magnetic field variations increase.

  1. Baseline state: Small circle + bar (Earth's field stable)
  2. Perturbation: Pattern expands outward from center
  3. Strong disturbance: Full matrix fills

What It Means

You are watching variations in Earth's magnetic field, visualized as deltas from calibrated baseline.

Earth's field: ~30–60 µT (depending on location)

Visible perturbations: >1 µT

Invisible.

Slow.

Very real.

Try This:

  1. Pass a magnet slowly nearby → progressive expansion
  2. Rotate a metallic object (screwdriver, keys) → field distortion
  3. Move your phone near the sensor → magnetic interference
  4. Bring ferromagnetic material close → field concentration


5.7 Mode 5 — Proximity (Infrared Reflection)

What You See

Concentric circles expanding from the center as objects approach.

Pattern behavior:

  1. Far (>20 cm): Small circle, bright
  2. Medium (5–20 cm): Multiple expanding rings, medium brightness
  3. Very close (<5 cm): Many circles fill matrix, dim

What It Actually Measures

The APDS9960 does not measure distance directly.

It measures infrared light reflection.

Sensor physics (real behavior, not corrected):

  1. Far: Low IR return → sensor reads high value (255) → bright
  2. Close: High IR return → sensor saturates → reads low value (0) → dim

This is the actual physical response of the APDS9960.

No inversion.

No correction.

Raw sensor truth.

What This Reveals

This mode exposes a real sensor limitation:

Proximity sensors saturate at very close range.

In industrial applications, this would be compensated with lookup tables or inverse mapping.

Here, it is deliberately left unprocessed to show:

  1. What the sensor actually sees
  2. How saturation behaves
  3. The difference between "measuring distance" and "measuring IR reflection"

Brightness still equals reality — the reality of photon flux, not geometric distance.

Try This:

  1. Slowly move your hand closer → watch pattern expand while brightness decreases
  2. Hold reflective material (metal foil) → earlier saturation
  3. Hold absorbent material (black fabric) → delayed saturation
  4. Compare with ruler measurements → see non-linear response curve

This is what transparency looks like.

5.8 Mode 6 — GPS Signals

What You See

Phase 1: Searching (no satellites)

  1. 4 points rotating in a circle (search animation)
  2. The device is transmitting: "I am looking for satellites"

Phase 2: Acquisition (satellites appearing)

  1. Discrete points light up as satellites are acquired
  2. First 4: dimmer (center cluster)
  3. Additional satellites: brighter (outer positions)
  4. Up to 12 satellites visible

Phase 3: Stable lock

  1. Static constellation pattern
  2. Brightness reflects signal quality

What It Means

You are not seeing position.

You are seeing radio signals from space, arriving at ~−130 dBm

(approximately one billion times weaker than WiFi signals).

  1. Indoors → searching (no satellites)
  2. Near a window → satellites begin appearing
  3. Outside → stable constellation

Try This:

  1. Start indoors → observe search animation
  2. Move near window → watch first satellites appear (2–5 minutes)
  3. Go outside → full constellation (8–12 satellites)
  4. Block GPS antenna → satellites fade, search resumes

This is one of the most powerful moments of the project.

You are watching invisible radio waves from 20,000 km above Earth.

5.9 Mode 7 — Bluetooth (BLE)

What You See

The device broadcasts its own BLE radio continuously (2.4 GHz).

No connection (default state):

  1. Pulsing dim pattern at center (heartbeat animation)
  2. The device is transmitting, but no receiver is listening

ESP32 monitor connected:

  1. Expanding concentric circles from center
  2. Circle count = signal strength (RSSI)
  3. Brightness = normalized RSSI

What It Means

Invisible 2.4 GHz radio waves — now perceptible.

Try This:

  1. Move ESP32 monitor closer/farther → watch circles change
  2. Walk with the monitor → see RSSI variations
  3. Disconnect monitor → return to pulse pattern


5.10 Mode 8 — Environment (Pressure / Temperature / Humidity)

This is the slowest mode.

And that is the point.

Environmental physics does not jump.

It drifts.

What You See

Three integrated visualizations:

Pressure (circle):

  1. Baseline: medium radius circle
  2. Increasing pressure: circle contracts
  3. Decreasing pressure: circle expands
  4. Radius reflects ±5 hPa range

Temperature (vertical bar):

  1. Blue glow: below baseline (cold)
  2. Neutral: at baseline
  3. Red/orange glow: above baseline (hot)
  4. Color intensity reflects ±10°C range

Humidity (scattered droplets):

  1. Fewer droplets: dry air
  2. More droplets: humid air
  3. Perimeter fills as humidity increases

What It Means

  1. Breathing near the device changes it
  2. Opening a window changes it
  3. Weather approaching changes it

You don't notice this mode.

You sense it over time.

Try This:

  1. Breathe on the device → temperature + humidity spike
  2. Open a window → slow pressure change
  3. Leave running overnight → track atmospheric drift

5.11 Bonus: Detecting Ground Loops (Bad Common Grounds)

The Problem

Ground loops occur when two connected devices have different ground potentials.

This creates parasitic currents that generate electromagnetic interference

across a wide frequency spectrum.

Common causes:

  1. Multiple power supplies with floating grounds
  2. Long cables between devices
  3. Poor grounding in electrical installations

How Phisualize It! Detects Them

Ghost mode (lowered threshold) reveals ambient EMI that would normally be invisible.

When a ground loop is present:

  1. Full matrix saturation with a multi-color pattern
  2. 8-band spectrum active, showing harmonic structure:
     - Red (50–125 Hz): mains frequency (50/60 Hz) and harmonics
     - Orange/Yellow (125–375 Hz): switching noise
     - Green/Cyan (375–625 Hz): high-frequency EMI
     - Blue/Magenta (625–1000 Hz): broadband noise
  3. Spectral flatness > 0.45 → classified as EMI (not audio)

Why This Matters

  1. Identify bad connections before they cause data corruption
  2. Debug audio equipment hum and buzz
  3. Verify power supply isolation
  4. Check cable shielding effectiveness

The Test

The setup intentionally created a ground loop:

  1. Two power supplies with non-common grounds
  2. Connected via data lines

Predictable result: a multi-spectrum EMI burst.

What Phisualize It! revealed was not just "noise present" but the spectral signature of a specific interference source:

  1. Dominant frequencies matching 50 Hz mains + harmonics
  2. A broadband component from the switching supplies

The system doesn't cheat.

It sees what's actually there.

5.12 One Important Rule

Brightness = Reality

  1. No auto-gain
  2. No adaptive scaling
  3. No visual exaggeration

If a phenomenon is weak, it stays dim.

If it grows, the light grows.

The system never tries to impress you.

It only tells the truth.

5.13 Why Not Show Everything at Once?

Because perception needs contrast.

  1. Mode 0 gives awareness
  2. Individual modes give understanding

Together, they form an instrument — not a light show.

5.14 Suggested Exploration Sequence

For first-time users:

  1. Start with Mode 0 (ALL) — observe ambient baseline
  2. Switch to Mode 1 (IMU) — tap the table
  3. Try Mode 2 (AUDIO) — speak or play music
  4. Compare Mode 3 (EMI) — bring electronic devices near
  5. Test Mode 4 (MAGNETIC) — pass a magnet slowly
  6. Watch Mode 6 (GPS) — place near window, wait
  7. Experiment with Mode 8 (ENV) — breathe on the device

The system becomes intuitive after 5–10 minutes of interaction.

What This Step Proves

  1. The system responds in real time
  2. Each mode reflects real physics
  3. One microphone can separate two invisible phenomena (Audio vs EMI)
  4. No pattern exists without a cause

If you interact with it, it reacts.

If you don't, it stays calm.

That calm is part of the demonstration.

Conclusion

WHAT YOU'VE BUILT, AND WHY IT MATTERS

You didn't just build an LED project.

You built a perceptual instrument.

A system that listens to the physical world,

processes it honestly,

and translates it into light — without decoration, without exaggeration, without illusion.

6.1 What You've Actually Built

Technically, this project delivers:

  1. Real-time detection of multiple invisible physical phenomena
  2. A complete signal-processing pipeline (filtering, FFT, classification)
  3. One microphone doing two jobs (Audio and EMI)
  4. Fixed, traceable normalization scales
  5. A display-agnostic visualization engine
  6. Optional wireless monitoring via BLE

All of this runs on a single ~€40 microcontroller.

No cloud.

No PC.

No external processing.

That alone is worth noting.

6.2 Why This Is Different

Most visualization projects fall into one of two traps:

  1. Pretty lights that react to something, but mean nothing
  2. Accurate data locked behind numbers and graphs

Phisualize It! deliberately sits between the two.

It is:

  1. Scientifically rigorous
  2. Immediately perceptible
  3. Visually honest

The rule Brightness = Reality is not aesthetic.

It is ethical.

If something is weak, it stays dim.

If nothing happens, nothing moves.

That restraint is part of the demonstration.

6.2b The Proximity Exception — A Teaching Choice

You may have noticed that Mode 5 (Proximity) behaves differently from other modes.

Other modes:

  1. Stronger phenomenon → brighter LEDs

Proximity mode:

  1. Closer object → dimmer LEDs (after initial increase)

This is not a bug.

The APDS9960 measures infrared reflection, not distance.

At very close range, the sensor saturates (too much reflected IR).

Design choice:

  1. Most projects would invert this signal to "fix" the behavior
  2. This project deliberately does not

Why?

Because sensor limitations are physics too.

Showing raw sensor behavior teaches:

  1. The difference between "measuring a phenomenon" and "inferring a property"
  2. How saturation works in optical sensors
  3. Why calibration and lookup tables exist in industrial systems

Transparency means showing what the sensor actually sees — even when it's counter-intuitive.

This is the only mode where brightness decreases with phenomenon intensity.

And that exception proves the rule everywhere else.

6.3 Frugality as a Design Choice

This project was not built with abundance.

  1. No custom PCB
  2. No expensive sensors
  3. No laboratory hardware

Instead:

  1. Integrated sensors
  2. Recycled enclosures
  3. Minimal wiring
  4. Software doing the hard work

This is engineering under constraint — and that constraint is visible in the result.

It proves that meaningful scientific instruments do not require expensive tools.

They require clear thinking.

6.4 Potential Applications

This system is intentionally open-ended.

Education

Physics teaching: demonstrating FFT and signal processing with tangible feedback

STEM workshops: making abstract concepts (spectral analysis, filters) visible and interactive

Signal processing courses: showing that "noise" contains meaningful information

Creative Installations

Responsive environments that genuinely react to ambient electromagnetic fields

Public installations revealing invisible urban infrastructure (power lines, WiFi, satellites)

Sound installations driven by actual acoustic structure, not just volume

Further Exploration

  1. Testing new spectral classification algorithms (beyond spectral flatness)
  2. Comparative studies of EMI in different environments (urban vs rural, indoor vs outdoor)
  3. Human perception experiments: when does data become awareness?
  4. Investigating vibro-acoustic coupling in different materials and structures

6.5 Make It Yours

Because the architecture is modular, it invites modification rather than consumption.

Beginner Modifications:

  1. Change color palettes in PhiMatrixEngine.cpp
  2. Adjust normalization scales for your local environment
  3. Add new display modes combining multiple phenomena
  4. Modify pattern animations for individual modes

Advanced Modifications:

  1. Port to different LED configurations (rings, strips, larger matrices)
  2. Add new sensors (CO₂, UV, radiation detector)
  3. Implement new classification algorithms (spectral centroid, zero-crossing rate)
  4. Create hybrid modes (e.g., vibro-acoustic coupling visualization)
  5. Build PhiP350 (recycled speaker LEDs — requires LED mapping adaptation)
  6. Design custom enclosures for specific installation contexts

Code Architecture Supports:

  1. Display swapping — only PhiMatrixEngine needs modification
  2. Sensor addition — create new module following existing pattern
  3. Classification tuning — adjust thresholds in PhiScientificModel

6.6 Known Limitations and Future Improvements

Current Limitations:

  1. Environmental sensors (HS3003) have slow response time (~minutes for temperature/humidity)
  2. GPS requires clear sky view — limited indoor functionality
  3. 8×8 resolution limits harmonic visualization detail
  4. Flash/RAM constraints limit FFT size expansion beyond 128 samples

Potential Improvements:

  1. Higher resolution displays (16×16 or circular configurations)
  2. Additional phenomena: UV index, CO₂ concentration, ionizing radiation
  3. Configurable normalization via Serial commands (real-time adjustment)
  4. Data logging to SD card for long-term environmental monitoring
  5. Multi-device synchronization for spatially distributed sensing

6.7 Resources and Downloads

Available on This Instructable:

  1. Complete source code (all .h and .cpp files)
  2. Wiring diagrams
  3. Library list with exact versions tested
  4. Video demonstrations of all display modes

Advanced Builds:

  1. PhiP350 (recycled speaker LEDs, 350+ LEDs) — separate tutorial in development
  2. PhiRing (circular 60-LED configuration) — alternative form factor

Community:

  1. Post your builds in the comments
  2. Share modifications and discoveries
  3. Ask questions — this project is meant to be understood

6.8 Acknowledgments

This project relies on excellent open-source libraries:

  1. arduinoFFT by Enrique Condes (spectral analysis)
  2. Adafruit NeoPixel library (LED control)
  3. Arduino sensor libraries by Arduino team (BMI270, APDS9960, LPS22HB, HS300x)
  4. ArduinoBLE library (wireless communication)

Built with Arduino Mbed OS Nano Boards platform.

6.9 Share Your Build

Built your own Phisualize It?

  1. Post photos in the comments below
  2. Share modifications you've made to the code or hardware
  3. Report interesting phenomena you've discovered in your environment

Found a new use case?

  1. Educational contexts
  2. Art installations
  3. Research applications

This project is designed to evolve through community use.

6.10 Final Thought

The invisible world has always been there.

Vibrations in the floor.

Fields in the air.

Signals from space.

Noise inside silence.

We usually measure it.

Rarely perceive it.

This project is about that difference.

Not just seeing data —

but becoming aware of the physics around you.

You don't just visualize it.

You Phisualize it.

APPENDIX

phisualizeIt_analysis.png
imu_analysis.png

TECHNICAL NOTE ON SAMPLING PERFORMANCE

A.1 System Performance Trade-offs

This project prioritizes accessibility and demonstration of multiple phenomena.

However, there is a measurable trade-off when running all sensors simultaneously.

A.2 IMU Sampling Rate Impact

Theoretical Performance:

  1. Target IMU sampling rate: 90 Hz
  2. FFT resolution: 0.70 Hz per bin
  3. Frequency range: 0–45 Hz

Actual Performance (All Sensors Active):

  1. Achieved IMU sampling rate: 75–80 Hz
  2. Performance loss: ~15%
  3. Cause: Sensor polling overhead (GPS, barometer, humidity, proximity, magnetometer, BLE)

Impact on Scientific Accuracy:

  1. Frequency bin width shifts: 0.70 Hz → ~0.61 Hz per bin (bin width = sampling rate ÷ 128, at ~78 Hz achieved)
  2. Nyquist range reduction: 45 Hz → ~39 Hz
  3. FFT timing jitter: slightly increased, blurring the effective frequency axis

A.3 Scientifically Rigorous Configuration

For maximum scientific precision focused on the core innovation (Audio/EMI classification + vibration analysis):

Minimal Configuration:

Essential modules only:

  1. PhiIMU (vibrations)
  2. PhiAudio (audio + EMI classification)
  3. PhiScientificModel (FFT + spectral analysis)
  4. PhiMatrixEngine (visualization)

Optional addition:

  1. PhiMagnetometer (performance loss: 0.22%, negligible)

Disable:

  1. PhiGPS
  2. PhiBarometer
  3. PhiHumidity
  4. PhiProximity
  5. PhiBLE (if not monitoring)

Result:

  1. Guaranteed 90 Hz IMU sampling
  2. Full spectral resolution maintained
  3. Deterministic timing
  4. Core innovation preserved

A.4 Implementation: Scientific Mode

To run the scientifically rigorous configuration, comment out sensor updates in the main loop:

void loop() {
  // Core (always active)
  imu.update();
  audio.update(ghostMode);

  // Optional: negligible impact
  mag.update();

  // Disable for maximum precision:
  // gps.update();
  // baro.update();
  // hum.update();
  // prox.update();

  // Rest of processing...
}

A.5 Why This Matters

For Educational/Demonstration Use:

  1. Full configuration is appropriate
  2. 15% loss is acceptable for showing multiple phenomena
  3. Trade-off is transparent and documented

For Research/Precision Work:

  1. Minimal configuration recommended
  2. Eliminates timing variability
  3. Focuses on the core spectral analysis innovation
  4. Maintains scientific rigor

A.6 Verification

You can verify actual sampling rate via Serial Monitor:

IMU actual rate: [displayed every 10 seconds]

Expected: 90 Hz

Measured: 75–80 Hz (full config) or 90 Hz (minimal config)

A.7 Design Philosophy

This trade-off reflects the project's dual nature:

  1. As a demonstration instrument: show many invisible phenomena → accept minor precision loss
  2. As a scientific tool: focus on the core innovation → maximize precision

The architecture supports both approaches.

You choose based on your use case.