motion-induced blindness

A fascinating illusion. Stare at the green dot for a bit.
More info here

This is just a remake of the Wikipedia GIF animation, but having the JavaScript code and its variables/settings at hand opens it up for experimentation.

<div style="background-color:black;">
<canvas id="can2" width="600" height="600"></canvas>
</div>
<script>
//motion-induced blindness - after the Wikipedia GIF animation
var rotationRate= 0.1;  //rps
var blinkRate= 2.5;  //Hz
var numCrosses= 7;
var numDots= 3;
var dotRadius= 5;
var crossWidth= 0.1;  //percent
var crossWeight= 3;
var colorBackground= '#000';
var colorCrosses= '#00F';
var colorCenter= '#0F0';
var colorDots= '#FF0';
var can2, ctx2;
(function() {
    can2= document.getElementById('can2');
    ctx2= can2.getContext('2d');
    window.requestAnimationFrame(draw2);
})();
function draw2(time) {
    var t= time*0.001;  //milliseconds to seconds
    var w2= can2.width*0.5;
    var h2= can2.height*0.5;
    var uw= can2.width/3;

    ctx2.setTransform(1, 0, 0, 1, 0, 0);  //reset last frame's transform
    ctx2.fillStyle= colorBackground;
    ctx2.fillRect(0, 0, can2.width, can2.height);

    //rotating grid of crosses
    ctx2.translate(w2, h2);
    ctx2.rotate(t*rotationRate*Math.PI*2);
    ctx2.lineWidth= crossWeight;
    ctx2.strokeStyle= colorCrosses;
    ctx2.beginPath();
    for(var i= 0; i<numCrosses; i++) {
        var y= i*(uw*2)/(numCrosses-1)-uw;
        for(var j= 0; j<numCrosses; j++) {
            var x= j*(uw*2)/(numCrosses-1)-uw;
            ctx2.moveTo(x-(crossWidth*uw), y);
            ctx2.lineTo(x+(crossWidth*uw), y);
            ctx2.moveTo(x, y-(crossWidth*uw));
            ctx2.lineTo(x, y+(crossWidth*uw));
        }
    }
    ctx2.stroke();

    //blinking center dot
    ctx2.setTransform(1, 0, 0, 1, 0, 0);
    if((t*blinkRate)%1>0.5) {
        ctx2.fillStyle= colorCenter;
        ctx2.beginPath();
        ctx2.ellipse(w2, h2, dotRadius, dotRadius, 0, 0, Math.PI*2);
        ctx2.fill();
    }

    //static surrounding dots
    ctx2.translate(w2, h2);
    ctx2.fillStyle= colorDots;
    for(var i= 0; i<numDots; i++) {
        ctx2.rotate(Math.PI*2/numDots);
        ctx2.beginPath();
        ctx2.ellipse(can2.width/4, 0, dotRadius, dotRadius, 0, 0, Math.PI*2);
        ctx2.fill();
    }

    window.requestAnimationFrame(draw2);
}
</script>

Also attached is the same code ported to Processing and SuperCollider.

Binary Data: processing (1.39 KB)
Binary Data: supercollider (1.57 KB)


A much improved version of my old Max/MSP/Jitter application anneVideotracking.

With this version you can use 12 zones on a webcam video input to trigger MIDI, sound files, audio input (mics) and OSC messages (sent to SuperCollider, for example).
The zones include filters and different types of thresholds and calibration. The data can be on/off triggers and/or continuous values.
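As a rough illustration of how a zone might turn its continuous value into an on/off trigger, here is a minimal Python sketch of a threshold with hysteresis. The class name and threshold values are my own inventions for this example, not taken from the Max patch:

```python
class ZoneTrigger:
    """Turn a continuous zone value (0.0-1.0) into an on/off trigger.
    Two thresholds (hysteresis) avoid rapid retriggering around one level."""
    def __init__(self, on_level=0.6, off_level=0.4):
        self.on_level= on_level
        self.off_level= off_level
        self.state= False
    def update(self, value):
        if not self.state and value>self.on_level:
            self.state= True   #rising edge - fire a trigger here
        elif self.state and value<self.off_level:
            self.state= False  #falling edge
        return self.state

zone= ZoneTrigger()
print([zone.update(v) for v in [0.1, 0.5, 0.7, 0.5, 0.3]])
#→ [False, False, True, True, False]
```

Note how the value 0.5 gives a different result on the way up than on the way down - that gap is what keeps a noisy camera signal from retriggering constantly.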

(Sorry for the terrible GUI design.)

Download the macOS standalone from here.

And here's the updated SuperCollider example that demonstrates how to use the OSC data to control some sine oscillators.

//to start: select all & cmd+enter
//to stop: cmd + .
//(parts of this snippet were lost - the OSC responder and synth wrappers are reconstructed)
n= 12;
s.latency= 0.05;
s.waitForBoot{
        var dlast= 0.dup(n);
        d= {Bus.control(s, 1)}.dup(n);  //absolute zone values
        e= {Bus.control(s, 1)}.dup(n);  //zone value changes (diffs)
        OSCFunc({|m|
                var index= m[1], val= m[2], diff= (val-dlast[index]).abs;
                d[index].set(val);
                e[index].set(diff);
                dlast.put(index, val);
        }, \anneVideoTracking);
        SynthDef(\annetest, {
                var src= Mix({|i| SinOsc.ar(i*100+400, 0, LagUD.kr(In.kr(d[i].index), 0.01, 0.1))}.dup(n));
                Out.ar(0, src!2);
        }).add;
        s.sync;
        Synth(\annetest);
};


Last summer I wrote code to talk to the YDLIDAR X4 lidar – a 360-degree laser range scanner (10 m). It was quite difficult to parse the data and get a correct readout of the point cloud. The X4_Lidar_Development_Manual.pdf had all the information, but it was quite obscure.


I also added tracking (the red cross in the screenshot) to figure out the coordinates of a person moving around in a room. Nothing fancy, but it worked OK.
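To give an idea of the kind of decoding involved, here is a minimal Python sketch based on my reading of the development manual. The byte layout and scaling constants (angle stored as degrees*64 shifted one bit, distance as quarter-millimetres) should be double-checked against the PDF before relying on them:

```python
import math
import struct

def parse_x4_packet(packet):
    """Decode one X4-style scan packet into (angle_deg, distance_mm) pairs.
    Assumed layout: 2-byte header 0xAA 0x55, packet type, sample count,
    start angle, end angle, checksum, then one 16-bit sample per point."""
    ph, ct, lsn, fsa, lsa, cs= struct.unpack('<HBBHHH', packet[:10])
    assert ph==0x55AA, 'bad packet header'
    angle_fsa= (fsa>>1)/64.0                 #start angle in degrees
    angle_lsa= (lsa>>1)/64.0                 #end angle in degrees
    span= (angle_lsa-angle_fsa)%360          #angular span of this packet
    points= []
    for i in range(lsn):
        raw,= struct.unpack_from('<H', packet, 10+i*2)
        dist= raw/4.0                        #distance in mm
        ang= (angle_fsa+span*i/max(lsn-1, 1))%360  #linear interpolation
        if dist>0:                           #zero means no valid reading
            points.append((ang, dist))
    return points

def to_xy(angle_deg, dist_mm):
    """Polar to Cartesian - the basis for tracking a position in the room."""
    a= math.radians(angle_deg)
    return (dist_mm*math.cos(a), dist_mm*math.sin(a))
```

The real protocol also specifies a distance-dependent angle correction per sample, which I have left out here for brevity.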

The attached code is for Unity3d and written in C#.

Binary Data: Lidar.cs (15.5 KB)


Just some hypnotic graphics...

The JavaScript code above is this...

<div style="background-color:black;">
<canvas id="can" width="800" height="600"></canvas>
</div>
<script>
var width, height;
var ctx, frameCount= 0;
(function() {
    var can= document.getElementById('can');
    ctx= can.getContext('2d');
    width= can.width;
    height= can.height;
    ctx.fillStyle= '#FFF';
    window.requestAnimationFrame(draw);
})();
function draw() {
    ctx.setTransform(1, 0, 0, 1, 0, 0);  //reset last frame's translate
    ctx.clearRect(0, 0, width, height);
    ctx.translate(width*0.5, height*0.5);
    var theta= Math.sin(frameCount*0.001)*Math.PI*2*4;
    for(var y= 0; y<height; y++) {
        for(var i= 0; i<10; i++) {
            ctx.fillRect((Math.sin(y*0.1+theta+(i*2))*100), y, 2, 2);
        }
    }
    frameCount= frameCount+1;
    window.requestAnimationFrame(draw);
}
</script>

Originally this was a quick sketch made in Processing...

//spiral.pde - processing
void setup() {
  size(800, 600);
  fill(255);
  noStroke();
}
void draw() {
  background(0);
  translate(width*0.5, height*0.5);
  float theta= sin(frameCount*0.001)*TWO_PI*4;
  for(int y= 0; y<height; y++) {
    for(int i= 0; i<10; i++) {
      rect((sin(y*0.1+theta+(i*2))*100), y, 2, 2);
    }
  }
}

And then ported to SuperCollider...

//spiral.scd - supercollider
var width= 800, height= 600;
var win= Window("spiral", Rect(100, 100, width, height), false);
var usr= UserView(win, Rect(0, 0, width, height));
usr.background= Color.black;
usr.animate= true;
usr.drawFunc= {
        var theta= sin(usr.frame*0.001)*2pi*4;
        Pen.fillColor= Color.white;
        Pen.translate(width*0.5, height*0.5);
        height.do{|y|
                10.do{|i|
                        Pen.fillRect(Rect(sin(y*0.1+theta+(i*2))*100, y, 2, 2));
                };
        };
};
win.front;

more processing tweets

Three new Processing tweets...


int s=900,i;void setup(){size(1200,s);strokeWeight(99);}void draw(){stroke(9,9);ellipse(i++%1200,millis()%750+99,i%s/350.,(20+i)%99);}// #p5


int j,i;void setup(){size(1024,768);}void draw(){translate(512,384);i=frameCount;while(i-->1){rect(j++%i,j%i,3,i/9%9);rotate(0.009);}}// #p5


float j=433,i=9e3;size(1024,768,P3D);fill(9,99);beginShape();while(i>0){vertex(sin(i--/99)*j+j,sin(i/j/8)*j*2,cos(i*2)*j);}endShape();// #p5


and more previously

p5 tweets

Constraints - I love them. Inspired by Abe's twitter experiments, I've also played with creating small one-line Processing programs that are 140 characters long.
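Checking that a sketch actually fits within the limit is easy to automate. A trivial helper (my own, hypothetical, not part of the original workflow):

```python
def fits_tweet(code, limit=140):
    """Return (ok, count) for a one-line sketch vs. the character limit."""
    return (len(code)<=limit, len(code))

sketch= 'int i;void setup(){size(900,900);}void draw(){rect(i++%900,i%900,9,9);}// #p5'
print(fits_tweet(sketch))  #well under 140 characters
```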

Below is a video of number 0002, plus the twitter code tweets and screenshots. Note that many, but not all, are animations. Copy and paste the lines into Processing (2.0) to try them out.

p5tweet0002 from redFrik on Vimeo.


int i;noStroke();size(999,900);for(i=0;i<999;i++){fill(255,0,0,9);rect(i%99,i,i,i);}for(i=0;i<999;i++){fill(0,200,0,3);rect(i,i,i,i);}// #p5


int j,i=0;void setup(){size(1200,900,P3D);frameRate(999);}void draw(){for(j=0;j<99;)rect(i++%(1199-j++),int(i/99)%(999-j),i%12,j%16);}// #p5


int s=900,j,i=j=0;void setup(){size(s,s);fill(0,9);textSize(80);}void draw(){text(i+j,(sin(i++)/3+0.3)*s,(cos(j+++(i/4e3))/3+0.5)*s);}// #p5


int s=900,j,i=j=0;void setup(){size(s,s);stroke(255,9);fill(9,3);}void draw(){quad(i++,j++,j,i,s-i,i-50,s-j,j);i=(i<<j%4)%1200;j=j%s;}// #p5


int s=900,i=0;void setup(){size(s,s,P3D);stroke(255,0,0);fill(10,4);}void draw(){translate(i++%s,s/2);rotate(float(i)/s);sphere(i%s);}// #p5


int s=900;float i=0;void setup(){size(s,s,P3D);stroke(99,9);fill(0,2);}void draw(){translate(i++%s,s/2);rotate(cos(i/50));box(i%s/3);}// #p5


background(0);noStroke();for(float i=0;i<99;i=i+0.0252){size(1200,900,P3D);fill(255,0,0,60);translate(i+9,i);rotate(i*1.8);sphere(i);}// #p5


void setup(){size(1600,900);background(255);}void draw(){textSize(millis()%1200);fill(second()*4,0,0,second());text(millis(),10,880);}// #p5


float j,i=0;void setup(){size(1200,900,P3D);}void draw(){for(j=0;j<133;j++){rect(9*j+1,sin((i+++j)*0.75/cos(j/99)/5e3)*99+450,9,9);};}// #p5


float i,k=450;void setup(){size(900,900,P3D);textSize(k);}void draw(){translate(k,k);fill(i%1*k/2,60);rotate(i+=+.01);text("$",99,0);}// #p5


int i,j,k=1200;void setup(){size(k,900);fill(255,200);}void draw(){background(0);for(i=0;i<8e3;)text(i++*j/k%k,i%131*9,i/131*16);j++;}// #p5


int j=200,i=900;size(j*6,i,P3D);lights();translate(700,540);for(i=1;i<j;){fill(i/2,50);rotate(j/i);translate(i,i,-2.7);sphere(i+++j);}// #p5


int j=480,i=900;size(j*3,i,P3D);noStroke();lights();translate(660,j);for(i=1;i<j;){fill(i,0,0,10);rotate(i/4e4,1.1,2.2,3.3);box(i++);}// #p5


int s=900,i=0;void setup(){size(1200,s,P3D);}void draw(){translate(600,450);rotateX(i*.0021);fill(i++%256,30);sphere(sin(i*.0014)*s);}// #p5


int i,s=900;void setup(){size(s,s);frameRate(1e4);stroke(255,25);}void draw(){fill(i++%89,0,0,127);rect(i%90*9,i%91*9,i*i%92,i*i%93);}// #p5


int i,s=900,t=1200;void setup(){size(t,s);noStroke();}void draw(){fill(i++%256,25);quad(i%t,i/3%s,i/4%t,i%s,i/5%t,i/4%s,i/3%t,i/2%s);}// #p5


int t=0;void setup(){size(900,900);background(0);stroke(255,9);}void draw(){translate(450,450);line(sin(t)*421,cos(t++)*400,t%9,t%9);}// #p5


int s=900;size(1600,s);fill(255,9);while(s>9){rotate(1e-3);arc(s+420,s,s,s,0,7);arc(1000-s,s+100,s,s,0,7);arc(s+500,400-s,s,s--,0,4);}// #p5


int i,j,s=900;void setup(){size(s,s,P3D);smooth(8);}void draw(){stroke(i);line(i,j,s-j,i);if(j%5<1){i=(i+1)%s;}if(i%11<1){j=(j+i)%s;}}// #p5


int s=900;void setup(){size(1200,s,P3D);}void draw(){fill(s,50);translate(sin(s)*110+600,cos(s)*99+450);rotate(s);box(s);s=(s+1)%900;}// #p5


cheap 4-channel videoplayer

For the dance piece Ich(a) by Zufit Simon I constructed a system with four Raspberry Pi mini-computers and buttons to trigger playback of four video streams. As the videos didn't need to run in exact frame-by-frame sync, this was a very cheap way to get four-channel high-quality video playback. The total cost was about (RPi 28*4)+(SD card 6*4)+(5V power 1*7) ≈ 141 Euro. I chose the model A of the Raspberry Pi to keep the cost and power consumption down. The four computers share a 5V power supply of 2 amps and are powered over the GPIO pins. Video cables run 50 meters down to the stage and into separate flat-screen monitors. The monitors are built into boxes that can be piled up or rolled around independently.

The videos are stored on the 4 GB SD cards that also hold the Linux operating system. I converted the videos from DVD to MP4 using ffmpeg with the following settings...

ffmpeg -i concat:"/Volumes/MONITOR01_may2012_DVD/VIDEO_TS/VTS_01_1.VOB|/Volumes/MONITOR01_may2012_DVD/VIDEO_TS/VTS_01_2.VOB" -an -vcodec libx264 -profile:v high -preset fast -crf 18 -b-pyramid none -f mp4 MONITOR01_may2012.mp4

That'll take two chapters, convert them to a single MP4 and skip the sound track (the -an flag).
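Since the concat: input is just the VOB paths joined with | characters, the command generalizes easily to any number of chapters. A small Python helper (hypothetical, not from the original workflow) that builds the same argument list:

```python
def ffmpeg_concat_cmd(vobs, out, crf=18):
    """Build an ffmpeg argument list that joins VOB chapters and encodes to MP4.
    Mirrors the flags used above: no audio (-an), libx264, high profile."""
    src= 'concat:'+'|'.join(vobs)
    return ['ffmpeg', '-i', src, '-an', '-vcodec', 'libx264',
            '-profile:v', 'high', '-preset', 'fast', '-crf', str(crf),
            '-f', 'mp4', out]

cmd= ffmpeg_concat_cmd(['VTS_01_1.VOB', 'VTS_01_2.VOB'], 'MONITOR01_may2012.mp4')
```

Pass the resulting list to subprocess.run() to avoid any shell quoting trouble with the | characters.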

The Python program running on each computer is below. It plays a video to the end and then waits for a button trigger. If a button is pressed before the video has finished, it stops and jumps to the next video - all in a cyclic fashion.
#for a Raspberry Pi running Raspbian
#this script will cycle through videos in sequence when a GPIO pin is grounded

#pinSwi (pulled up internally) - gnd this pin to switch to the next video
#pinOff (pulled up internally) - gnd this pin to shut down the system

videos= ['/home/pi/ICHA1.mp4', '/home/pi/MONITOR01_may2012.mp4', '/home/pi/BLACK.mp4', '/home/pi/FLESH.mp4', '/home/pi/TESTBILDER.mp4']
delays= [0, 0, 0, 0, 0] #extra start delay time in seconds - one value for each video
pinSwi= 23
pinOff= 24

import pexpect
from time import sleep
import RPi.GPIO as GPIO
import os
GPIO.setmode(GPIO.BCM)
GPIO.setup(pinSwi, GPIO.IN, pull_up_down= GPIO.PUD_UP)
GPIO.setup(pinOff, GPIO.IN, pull_up_down= GPIO.PUD_UP)

def main():
        os.system("clear && tput civis")        #clear and hide cursor
        index= 0        #keeps track of which video to play
        while True:
                sleep(delays[index])    #optional extra start delay
                omx= pexpect.spawn('/usr/bin/omxplayer -rp '+videos[index])
                omx.expect('Video')     #wait for playback to start
                while GPIO.input(pinSwi)==True:         #wait for the switch button
                        if GPIO.input(pinOff)==False:
                                omx.send('q')   #quit omxplayer
                                os.system("tput cnorm && sudo halt")
                        sleep(0.1)
                omx.send('q')   #quit omxplayer
                sleep(0.5)              #safety
                index= (index+1)%len(videos)

if __name__ == "__main__":
        main()
//--Instructions for installing (you'll need a model B to prepare an SD card, then move it over to the model A Raspberry Pi)

//--prepare the RPi
* use Pi Filler to transfer 2013-05-25-wheezy-raspbian.img to the SD card
* put the SD card in RPi model B
* select 'Expand Filesystem' and enable SSH under 'Advanced Options' in the raspi-config menu
* select 'Finish' and reboot
* log in with pi/raspberry
* sudo apt-get update
* sudo apt-get upgrade
* sudo apt-get install python-pexpect avahi-daemon

//--copy files from macOS
* open a terminal window on main computer
* cd to folder with videos
* edit the file and select which videos to use
* optionally add delaytimes if some videos should start later
* scp MONITOR01_may2012.mp4 ICHA1.mp4 BLACK.mp4 FLESH.mp4 TESTBILDER.mp4 pi@raspberrypi.local:/home/pi/

//--back to model B
* sudo pico /etc/rc.local
* add the following before the exit line: (sleep 1; python /home/pi/ &) # autostart video player
* press ctrl+o to save and ctrl+x to exit
* sudo halt

//--start model A
* take out the SD card from model B and put it in model A
* connect HDMI or composite video, GPIO pins and apply power - the first video should start
* ground pin 23 to cycle through the videos
* ground pin 24 to turn off the computer

//--Useful commands (connect keyboard to RPi model A, type pi/raspberry to log in)
sudo pkill omxplayer.bin     #you might need to type this blindly, without the terminal being visible

ssh-keygen -R raspberrypi.local     #useful for resetting ssh/scp after changing SD cards

It's not pretty but it works. Some day I'll build it into a real rackmount box.



Since the category 'visuals' is underrepresented in this blog and I don't like to embed video in my standard [html] pages, I thought I'd include this old piece here. This is the shorter, abridged version of the full piece. The quality isn't the best - it's pixelated and stuttering. One day I should re-render it at 60 FPS and a higher resolution. It looks a lot better when running in realtime from a computer.

strömSA4 from Mattias Petersson on Vimeo.

Ström by Mattias Petersson (music) and Fredrik Olofsson (video) is, in its full version, a 45-minute minimalistic piece for five loudspeakers, live-electronics and live-video, based on an open-minded, artistic approach towards electricity. The piece is an attempt to transfer electric currents via sound to the audience. The five speakers in the surround system struggle to take over the sonic stream like electro-magnets. Sine waves and noise rotate around the listeners at breakneck speeds, try to charge them with static electricity and, as an ultimate goal, even make them levitate. The video part is in direct connection with the sound and is generated out of five discrete lines – one for each channel in the surround system. The lines are treated in different ways, and as the high voltage builds up in the music they look more and more like electric wires, inflicting violent discharges and eruptions on each other. This version was made for a promotional DVD release on Swedish sound art.

Also see here

