Balance

“Balance” is my vision of what painting can become in the digital+ age. What began as a mid-morning vibecoding session grew into a deeper exploration of movement, harmony, and emotion—using new materials unlocked by computation. Here, I’m not just working with color and form; I’m sculpting with algorithms, painting with parameters, and harnessing invisible forces like magnetism, gravity, and chaos.

Central to this piece is Perlin noise, an algorithm renowned for its ability to generate lifelike textures and patterns, mirroring the subtle motions of nature—flowing rivers, drifting clouds, and shifting sands. By leveraging Perlin noise and tools I developed to balance luminosity in real time, I aimed to create something simultaneously organic and precise, capturing how certain visual effects evoke distinct emotional responses.

Created in collaboration with Gemini 2.5 Pro, this artwork is no less Art than any traditional medium. Enjoy the preview below—the full-screen version makes for an excellent ambient projection or screensaver.


Balance. An exercise in human-AI art.

Link to Fullscreen Version


Interaction:

  • Click/Tap & Drag: Pan your view across the infinite canvas.
  • Single Click/Tap: Instantly generate a completely new parameter set and visual style.
  • ‘R’ Key: Regenerate parameters (in the dedicated view).
  • ‘F’ Key: Toggle fullscreen mode (in the dedicated view).
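Separating a single click (regenerate) from a drag (pan) comes down to measuring how far the pointer moved between press and release. A minimal sketch of that pattern in plain JavaScript—the names and the 5-pixel tolerance are illustrative assumptions, not the actual source:

```javascript
// Distinguish a click from a drag by pointer travel distance.
// CLICK_TOLERANCE and all names here are hypothetical.
const CLICK_TOLERANCE = 5; // px of movement still counted as a click

function makePointerTracker() {
  let startX = 0, startY = 0;
  return {
    press(x, y) { startX = x; startY = y; },
    release(x, y) {
      const moved = Math.hypot(x - startX, y - startY);
      return moved <= CLICK_TOLERANCE ? "click" : "drag";
    },
  };
}

const tracker = makePointerTracker();
tracker.press(10, 10);
console.log(tracker.release(12, 11)); // small movement → "click"
tracker.press(10, 10);
console.log(tracker.release(80, 40)); // large movement → "drag"
```

In a p5.js sketch, `press` and `release` would be called from the `mousePressed` and `mouseReleased` event handlers.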

The Search for Equilibrium

The title “Balance” reflects the core dynamic driving the artwork. The system constantly seeks equilibrium between competing forces:

  • Visual Density: Custom adaptive algorithms monitor pixel density, adjusting background fade rates to maintain a target balance between stark black negative space and visible trails, preventing visual clutter while ensuring the piece never feels empty.
  • Brightness Distribution: Similarly, the system analyzes the ratio of bright-to-dim pixels, subtly adjusting particle luminosity to achieve a target contrast level, ensuring a “brilliant” display without overwhelming the viewer.
  • Chaos & Order: The foundation lies in Perlin noise flow fields, providing an organic, pseudo-random structure. Layered on top are systems for parameter morphing (both gradual full-set changes and continuous micro-morphs of individual variables) and localized flow perturbations, introducing controlled chaos and preventing visual stagnation.
  • Autonomy & Interaction: The piece evolves independently over time, but user interaction (panning, regenerating) introduces discontinuities, creating a dialogue between the viewer and the autonomous system.
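The density and brightness balancing described above is, at its core, a feedback loop: measure a ratio on a downscaled buffer, compare it to a target, nudge a control value. A standalone sketch of that idea in plain JavaScript—function names, thresholds, and targets are all illustrative assumptions, not the actual source:

```javascript
// Fraction of near-black pixels in a (hypothetical) downscaled buffer.
// pixels: flat array of grayscale values 0-255.
function measureDarkRatio(pixels, threshold = 16) {
  let dark = 0;
  for (const v of pixels) if (v < threshold) dark++;
  return dark / pixels.length;
}

// Nudge the background fade alpha toward a target dark ratio:
// too little black → fade faster (higher alpha); too much → fade slower.
function adjustFadeAlpha(currentAlpha, darkRatio, targetDark = 0.6, gain = 0.5) {
  const error = targetDark - darkRatio;
  const next = currentAlpha + error * gain * currentAlpha;
  // Clamp to a sane range so the loop stays stable.
  return Math.min(40, Math.max(1, next));
}

// Example: a frame with no black pushes the fade alpha up.
const brightFrame = new Array(100).fill(200);
console.log(adjustFadeAlpha(10, measureDarkRatio(brightFrame)));
```

The same shape of loop, with a bright-pixel ratio driving a particle brightness multiplier, handles the contrast side of the balance.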

The goal was to create a perpetually engaging visual field that feels both intentional and emergent, beautiful from afar yet revealing intricate detail upon closer inspection.

Weaving Code into Art

“Balance” is built entirely in JavaScript using the p5.js creative coding library. Key technical features include:

  • Particle System: Thousands of particles navigate a dynamic vector field.
  • Evolving Perlin Noise: Multiple layers of Perlin noise define the flow field’s direction and strength, with parameters that evolve over time using noise and trigonometric functions for organic variation.
  • HSB Color Dynamics: Particle color (Hue, Saturation, Brightness) is dynamically calculated based on velocity, position, and global time, creating flowing color harmonies.
  • Adaptive Density/Brightness Control: A custom feedback loop analyzes a downscaled buffer of the canvas periodically to measure black/bright pixel percentages. This data dynamically adjusts background fade speed and particle brightness multipliers to maintain the target visual “Balance.”
  • Multi-Stage Parameter Morphing:
    • Major Morphs: Every few minutes, the entire set of ~40 controlling parameters smoothly interpolates towards a new, randomly generated target set over ~25 seconds.
    • Micro-Morphs: Between major morphs, individual parameters are randomly selected to subtly drift towards new values over 15-30 seconds, ensuring constant, gentle evolution.
  • Variable Particle Behavior: Particles are initialized with differing drag properties, causing variations in how closely they follow the flow field.
  • Interactive Panning: Viewport translation allows the user to explore the conceptually infinite 2D space.
  • Performance Optimization: Techniques like downscaled buffer analysis for pixel density checks were implemented to maintain smooth frame rates despite complex calculations.
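The flow-field core of the list above—noise mapped to an angle, particles steered by that angle with per-particle drag—can be sketched outside p5.js. Here a cheap sine-based pseudo-noise stands in for p5's `noise()`, and every name and constant is illustrative rather than taken from the piece:

```javascript
// Smooth-ish scalar field in [0, 1). NOT real Perlin noise; a stand-in
// so the sketch runs anywhere. The actual piece uses p5.js noise().
function pseudoNoise(x, y, t) {
  const v = Math.sin(x * 1.7 + t) * Math.cos(y * 2.3 - t * 0.5);
  return (v + 1) / 2;
}

// Map the noise value at a position to a flow direction in [0, 2π).
function fieldAngle(x, y, t, scale = 0.01) {
  return pseudoNoise(x * scale, y * scale, t) * Math.PI * 2;
}

// One integration step. Higher drag makes a particle track the field
// more loosely, giving the "variable particle behavior" effect.
function stepParticle(p, t, speed = 2, drag = 0.9) {
  const a = fieldAngle(p.x, p.y, t);
  p.vx = p.vx * drag + Math.cos(a) * speed * (1 - drag);
  p.vy = p.vy * drag + Math.sin(a) * speed * (1 - drag);
  p.x += p.vx;
  p.y += p.vy;
  return p;
}

const particle = { x: 100, y: 100, vx: 0, vy: 0 };
for (let t = 0; t < 10; t++) stepParticle(particle, t * 0.01);
console.log(particle.x.toFixed(2), particle.y.toFixed(2));
```

Parameter morphing fits naturally on top of this: interpolating `scale`, `speed`, or `drag` toward new random targets over many frames produces the gradual style shifts described above.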

Collaborating with the Ghost in the Machine

This project heavily utilized AI (specifically, large language models like the one generating this text) not just for code generation, but as an iterative collaborator in a process recently dubbed “Vibecoding.”

Vibecoding involves translating aesthetic goals, desired visual “vibes,” and high-level behavioral descriptions into functional code through a dialogue with an AI partner. With over two decades of programming experience and a background in fine arts, I could articulate specific visual targets: “avoid muddy water,” “create a compounding neon glow,” “ensure stark contrast but avoid sparseness,” “introduce chaotic elements without sacrificing performance.”

The process involved:

  1. Defining the Vision: Setting the artistic goals and constraints.
  2. Prompting & Generation: Requesting code structures, algorithms, and parameter ranges from the AI based on the vision.
  3. Integration & Testing: Implementing the AI’s suggestions into the p5.js sketch.
  4. Critique & Refinement: Observing the visual output, identifying deviations from the desired “vibe” (e.g., “muddy water,” “marching ants,” stuttering performance, unwanted scrollbars).
  5. Iterative Dialogue: Providing specific, descriptive feedback to the AI (“The fade is too aggressive,” “We lost the panning feature,” “Need a more robust way to detect clicks vs. drags”) and requesting targeted solutions or alternative approaches.
  6. Debugging & Optimization: Working with the AI to debug emergent behaviors and optimize performance-critical sections (like the pixel analysis).

This iterative loop allowed for rapid exploration of complex visual ideas. The AI acted as a highly skilled, tireless coding assistant, capable of suggesting multiple technical approaches (like MULTIPLY blend modes vs. alpha fading, or different pixel analysis strategies). However, the artistic direction, the definition of the desired “vibe,” the critical eye, and the ability to guide the AI towards the specific aesthetic goal remained firmly human. Wielding the AI effectively required translating artistic intent into precise technical prompts and evaluating the output against that intent.

A Synergy of Disciplines

“Balance” stands as a testament to the power of combining artistic sensibility with deep technical expertise. My fine arts background provided the visual language and aesthetic goals, while my programming experience enabled the implementation, debugging, and optimization of the complex generative systems required. This project highlights the unique synergy possible when these two disciplines intersect, particularly when amplified by collaborative AI tools.
