Channel: Library Questions - Processing 2.x and 3.x Forum

Help with client IP address


How do I get the IP address of a client?
client.ip() only returns the IP address of the server it is connected to.

When I create a server event handler,

    public void serverEvent(Server server, Client someClient) {
        System.out.println("We have a new client: " + someClient.ip());
    }

it prints the IP address of the client. But is it possible
to create something like this on the client side?
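
A minimal sketch of one possible workaround, if the goal is simply to know the client machine's own address: ask Java directly via java.net (this is an assumption on my side, not part of the Processing net library):

    import java.net.InetAddress;

    void setup() {
      try {
        // the address of the machine this client sketch is running on
        String myIp = InetAddress.getLocalHost().getHostAddress();
        println("This client's local IP: " + myIp);
      } catch (Exception e) {
        e.printStackTrace();
      }
    }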


Drop library drag-and-drop not working in P2D or P3D mode


SDrop works fine with the default renderer, but if the sketch is running in P2D or P3D mode, drag-and-drop does not work anymore. For example:

size(400, 400, P3D);

Any workaround for this?

Memory leak or misunderstanding? (IPCapture)


Hi,

The program below grabs frames from an MJPEG camera, stuffs them into a circular buffer, and displays them with a predefined delay.

While visually this behaves as expected, the problem I have is that memory usage continuously increases while the program runs, and eventually the program stops when it reaches the limit.

The IPCapture library example grabs the frames by reading the cam1 instance directly, like so: buffer[x] = cam1;

But I've had to change this slightly, because it would always show the last frame regardless of where in the buffer I was reading from. My guess is that assigning this way is similar to using a pointer to the actual IPCapture object rather than a copy of its image data. So I did this instead: buffer[x] = cam1.get();

This works as expected, making a copy of the frame and storing it in the PImage buffer. The problem is that it then seems to make a copy of the whole IPCapture object and not just the image data, taking up more memory every time I copy it to the buffer.

Is this a problem with the way the library works, or am I not doing this properly? Any input appreciated. Thanks!

import ipcapture.*;

IPCapture cam1;

// Circular buffer
PImage[] cam1_buffer;

// This determines the size of the circular buffer
int nFrames = 500;
// These are used to iterate through the buffer while writing and reading
// the read counter should always be at least one frame ahead of the write counter
int iWrite = 0, iRead = 1;

void setup() {
  fullScreen();

  // Set the cam's URL and start it
  cam1 = new IPCapture(this, "http://" + "192.168.0.251:88/cgi-bin/CGIStream.cgi?cmd=GetMJStream&usr=root&pwd=root", "root", "root");
  cam1.start();
  cam1.pixelWidth = 640;
  cam1.pixelHeight = 480;

  // Declare a frame buffer sized accordingly to our desired number of frames
  cam1_buffer = new PImage[nFrames];
}

void draw() {
  // Get a frame from the camera and store it in our buffer
  if (cam1.isAvailable()) {
    cam1.read();
    cam1_buffer[iWrite] = cam1.get();
    // Reads a frame from our buffer
    if(cam1_buffer[iRead] == null){
      // Display this text if the buffer isn't full yet.
      clear();
      fill(255);
      text("BUFFERING " + str(iWrite) +" / " + str(nFrames),300,10);
    }
    else{
      // Here we display our delayed frames
      image(cam1_buffer[iRead],0,0);
    }

    // Increment write and read counters
    iWrite++;
    iRead++;

    // Start writing over the beginning of our buffer every time we reach the end
    if(iRead >= nFrames-1){
      iRead = 0;
    }

    if(iWrite >= nFrames-1){
      iWrite = 0;
    }
  }
}
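
A minimal sketch of one way to keep memory flat (an assumption, not a confirmed fix): pre-allocate one PImage per buffer slot once in setup(), then copy the camera's pixel data into the existing slot with copy() instead of asking get() for a brand-new PImage every frame. This reuses the variables from the sketch above.

    // in setup(), after creating the array:
    for (int i = 0; i < nFrames; i++) {
      cam1_buffer[i] = createImage(640, 480, RGB);
    }

    // in draw(), after cam1.read(), instead of cam1_buffer[iWrite] = cam1.get():
    cam1_buffer[iWrite].copy(cam1, 0, 0, cam1.width, cam1.height,
                             0, 0, cam1_buffer[iWrite].width, cam1_buffer[iWrite].height);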

How can I make it play a sound when I hit a key, not only when I press Enter?


Hi guys. I've run into some trouble with my work in progress.

import ddf.minim.*;
import processing.video.*;

Movie myMovie;
Minim minim;
AudioPlayer s1, s2, s3, s4, s5, s6, s7, s8, s9, s10, s11, s12, s13, s14, bg1, bg2, bg3, bg4;

PImage bg;
PFont font;
int num = 500;
char t[] = new char[num];
int key_num = 0;
int text_w_gap = 40;
int text_h_gap = 70;
int ypos = 450;
int xpos = 510;
int val;
int notetime = 795;
//730 notetime for bg1 - single note
//235 notetime for bg1 fast tempo
//580 notetime for bg2 lick
//1300 notetime for bg3
// for play
int startTime;
int iForPlaySong=0;
boolean playSong=false;

void setup() {
  fullScreen();
  //size(1440, 900);
  bg = loadImage("paper.jpg");
  font = createFont("Albertsthal Typewriter.ttf", 14);
  textFont(font, 64);
  fill(40);
  minim = new Minim(this);

  s1 = minim.loadFile("1.mp3");
  s2 = minim.loadFile("2.mp3");
  s3 = minim.loadFile("3.mp3");
  s4 = minim.loadFile("4.mp3");
  s5 = minim.loadFile("5.mp3");
  s6 = minim.loadFile("6.mp3");
  s7 = minim.loadFile("7.mp3");
  s8 = minim.loadFile("8.mp3");
  s9 = minim.loadFile("9.mp3");
  s10 = minim.loadFile("10.mp3");
  s11 = minim.loadFile("11.mp3");
  s12 = minim.loadFile("12.mp3");
  s13 = minim.loadFile("13.mp3");
  s14 = minim.loadFile("14.mp3");
  bg1 = minim.loadFile("bg1.mp3");
  bg2 = minim.loadFile("bg2.mp3");
  bg3 = minim.loadFile("bg3.mp3");
  bg4 = minim.loadFile("bg4.mp3");
}

void draw() {

  background(bg);
  text("Like I said", 300, 110);
  text("Say what you wanna say", 300, 210);

  if (playSong) {
    if ( millis()>startTime+notetime) {
      if (iForPlaySong < key_num) {
        println("play " + t[iForPlaySong]);
        playNote(t[iForPlaySong]);
        iForPlaySong++;
        startTime=millis();
      }
    }
  } else
  {
    for (int i=0; i<key_num; i++) {
      text(t[i],
        (xpos+(i*text_w_gap))%width,
        ypos+( (int(xpos+(i*text_w_gap))) / width * text_h_gap));
    }
    if (key_num >= num) {
      key_num = 0;
    }
  }
}

void keyPressed() {
  if  (key==8) {
    key_num--;
  } else if (key==RETURN||key==ENTER) {
    println("Here 11111111111111111111111");
    playSong=true;

    bg2.play();   // for the bgm --------------------------------------- here

    startTime=0;
  } else {
    t[key_num] = key;
    key_num++;
  }
  println("pressed " + int(key) + " " + keyCode);
}

void playNote(char val) {
  if (val=='q') {
    s1.play();
    if (s1.position() > notetime )
    {
      s1.rewind();
    }
  }
  if (val=='w') {
    s2.play();
    if (s2.position() > notetime )
    {
      s2.rewind();
    }
  }
  if (val=='e') {
    s3.play();
    if (s3.position() > notetime )
    {
      s3.rewind();
    }
  }
  if (val=='r') {
    s4.play();
    if (s4.position() > notetime )
    {
      s4.rewind();
    }
  }
  if (val=='t') {
    s5.play();
    if (s5.position() > notetime )
    {
      s5.rewind();
    }
  }
  if (val=='a') {
    s6.play();
    if (s6.position() > notetime )
    {
      s6.rewind();
    }
  }
  if (val=='s') {
    s7.play();
    if (s7.position() > notetime )
    {
      s7.rewind();
    }
  }
  if (val=='d') {
    s8.play();
    if (s8.position() > notetime )
    {
      s8.rewind();
    }
  }
  if (val=='f') {
    s9.play();
    if (s9.position() > notetime )
    {
      s9.rewind();
    }
  }
  if (val=='g') {
    s10.play();
    if (s10.position() > notetime )
    {
      s10.rewind();
    }
  }
  if (val=='z') {
    s11.play();
    if (s11.position() > notetime )
    {
      s11.rewind();
    }
  }
  if (val=='x') {
    s12.play();
    if (s12.position() > notetime )
    {
      s12.rewind();
    }
  }
  if (val=='c') {
    s13.play();
    if (s13.position() > notetime )
    {
      s13.rewind();
    }
  }
  if (val=='v') {
    s14.play();
    if (s14.position() > notetime )
    {
      s14.rewind();
    }
  }
  // ---------------------------------- LEFT SIDE --------------------------------
  if (val=='y') {
    s1.play();
    if (s1.position() > notetime )
    {
      s1.rewind();
    }
  }
  if (val=='u') {
    s2.play();
    if (s2.position() > notetime )
    {
      s2.rewind();
    }
  }
  if (val=='i') {
    s3.play();
    if (s3.position() > notetime )
    {
      s3.rewind();
    }
  }
  if (val=='o') {
    s4.play();
    if (s4.position() > notetime )
    {
      s4.rewind();
    }
  }
  if (val=='p') {
    s5.play();
    if (s5.position() > notetime )
    {
      s5.rewind();
    }
  }
  if (val=='h') {
    s6.play();
    if (s6.position() > notetime )
    {
      s6.rewind();
    }
  }
  if (val=='j') {
    s7.play();
    if (s7.position() > notetime )
    {
      s7.rewind();
    }
  }
  if (val=='k') {
    s8.play();
    if (s8.position() > notetime )
    {
      s8.rewind();
    }
  }
  if (val=='l') {
    s9.play();
    if (s9.position() > notetime )
    {
      s9.rewind();
    }
  }
  if (val=='b') {
    s10.play();
    if (s10.position() > notetime )
    {
      s10.rewind();
    }
  }
  if (val=='n') {
    s11.play();
    if (s11.position() > notetime )
    {
      s11.rewind();
    }
  }
  if (val=='m') {
    s12.play();
    if (s12.position() > notetime )
    {
      s12.rewind();
    }
  }
  if (val=='.') {
    s13.play();
    if (s13.position() > notetime )
    {
      s13.rewind();
    }
  }
  if (val=='?') {
    s14.play();
    if (s14.position() > notetime )
    {
      s14.rewind();
    }
  }
}

This is the code.

I want to change two things:

1. Make it play the sound not only when I hit the Enter key, but also whenever I hit any other key (see the sketch at the end of this post).

2. When the interaction is done, how can I make it run again about one minute later?

I have some more problems, but I want to solve this one first. :(

Thanks for reading.
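
A minimal sketch of the first change (reusing the playNote() helper from the code above): sound the note immediately in keyPressed(), in addition to storing it for later playback.

    void keyPressed() {
      if (key == 8) {                          // backspace
        key_num--;
      } else if (key == RETURN || key == ENTER) {
        playSong = true;
        bg2.play();                            // for the bgm
        startTime = 0;
      } else {
        t[key_num] = key;
        key_num++;
        playNote(key);                         // play the note as soon as the key is hit
      }
    }

For the second question, one option (an assumption, not tested against this sketch) is to note millis() when the playback finishes and, in draw(), reset playSong and key_num once millis() exceeds that timestamp plus 60000.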

Showing an image and replacing it with another image if something happens


Hey, I'm using color-tracking code in Processing. What I want: e.g., if red is detected, show image 1; if green is detected, show image 2; if blue is detected, show image 3.

The problem is: once the last color has been detected and the last image is shown, and I then track the first color again, the first image is not brought to the front (I can't see it).

The whole code:

 import processing.video.*;
//import hypermedia.net.*;


PImage img;
PImage img2;
PImage img3;

Capture video;
final int TOLERANCE = 20;

float XRc = 0;// XY coordinate of the center of the first target
float YRc = 0;
float XRh = 0;// XY coordinate of the center of the second target
float YRh = 0;
float XRc2 = 0; // XY coordinate of the center of the third target
float YRc2 = 0;
float XRh2 = 0;// XY coordinate of the center of the fourth target
float YRh2 = 0;

int ii=0; //Mouse click counter

color trackColor; //The first color is the center of the robot
color trackColor2; //The second color is the head of the robot
color trackColor3; //The first color is the center of the robot 2
color trackColor4; //The first color is the center of the robot 2

void setup() {
img = loadImage("IMG_4700.JPG");
img2 = loadImage("2.JPG");
img3 = loadImage("3.JPG");
size(800,800);
video = new Capture(this,640,480);
video.start();

trackColor = color(94,164,126);
trackColor2 = color(60,110,194);
trackColor3 = color(197, 76,64);
trackColor4 = color(255,0,0);
smooth();
}

void draw() {
background(0);
if (video.available()) {
    video.read();
}

video.loadPixels();
image(video,0,0);

  float r2 = red(trackColor);
  float g2 = green(trackColor);
  float b2 = blue(trackColor);

  float r3 = red(trackColor2);
  float g3 = green(trackColor2);
  float b3 = blue(trackColor2);

  float r4 = red(trackColor3);
  float g4 = green(trackColor3);
  float b4 = blue(trackColor3);

  float r5 = red(trackColor4);
  float g5 = green(trackColor4);
  float b5 = blue(trackColor4);


  int somme_x = 0, somme_y = 0; // pour le calcul des baricentres
  int compteur = 0;

  int somme_x2 = 0, somme_y2 = 0; // pour le calcul des baricentres
  int compteur2 = 0;

  int somme_x3 = 0, somme_y3 = 0; // pour le calcul des baricentres
  int compteur3 = 0;

  int somme_x4 = 0, somme_y4 = 0; // pour le calcul des baricentres
  int compteur4 = 0;


  for(int x = 0; x < video.width; x++) {
    for(int y = 0; y < video.height; y++) {

      int currentLoc = x + y*video.width;
      color currentColor = video.pixels[currentLoc];

      float r1 = red(currentColor);
      float g1 = green(currentColor);
      float b1 = blue(currentColor);


      if(dist(r1,g1,b1,r2,g2,b2) < TOLERANCE) {
         somme_x += x;
         somme_y += y;
        compteur++;
      }

      else if(compteur > 0) {
        XRc = somme_x / compteur;
        YRc = somme_y / compteur;
      }


      if(dist(r1,g1,b1,r3,g3,b3) < TOLERANCE) {
         somme_x2 += x;
         somme_y2 += y;
        compteur2++;
      }

      else if(compteur2 > 0) {
        XRh = somme_x2 / compteur2;
        YRh = somme_y2 / compteur2;
      }

      if(dist(r1,g1,b1,r4,g4,b4) < TOLERANCE) {
         somme_x3 += x;
         somme_y3 += y;
        compteur3++;
      }

      else if(compteur3 > 0) {
        XRc2 = somme_x3 / compteur3;
        YRc2 = somme_y3 / compteur3;
      }

      if(dist(r1,g1,b1,r5,g5,b5) < TOLERANCE) {
         somme_x4 += x;
         somme_y4 += y;
        compteur4++;
      }

      else if(compteur4 > 0) {
        XRh2 = somme_x4 / compteur4;
        YRh2 = somme_y4 / compteur4;
      }

  }
  }


// track the color and show images
boolean c1 = false;
boolean c2 = false;
boolean c3 = false;


  if(XRc != 0 || YRc != 0) { // color Green detected
    c1 = true;
    c2 = false;
    c3 = false;
   }


   if(XRh != 0 || YRh != 0) { // color blue detected
    c2 = true;
    c1 = false;
    c3 = false;
   }

    if(XRc2 != 0 || YRc2 != 0) { // color red detected
      c3 = true;
      c1 = false;
      c2 = false;
    }


     if(c1 == true) {
       image(img,0,0); // show image 1
      } else if (c2 == true) {
       image(img2,0,0); // show image 2
     } else if (c3 == true) {
       image(img3,0,0); // show image 3
     }

}

The important snippet:

    // detect color and show images
    boolean c1 = false;
    boolean c2 = false;
    boolean c3 = false;


      if(XRc != 0 || YRc != 0) { // color Green detected
        c1 = true;
        c2 = false;
        c3 = false;
       }


       if(XRh != 0 || YRh != 0) { // color blue detected
        c2 = true;
        c1 = false;
        c3 = false;
       }

        if(XRc2 != 0 || YRc2 != 0) { // color red detected
          c3 = true;
          c1 = false;
          c2 = false;
        }


         if(c1 == true) {
           image(img,0,0); // show image 1
          } else if (c2 == true) {
           image(img2,0,0); // show image 2
         } else if (c3 == true) {
           image(img3,0,0); // show image 3
         }

Screenshots: the first object is tracked and image 1 is shown (1-object-tracked).

The second object is tracked and image 2 is shown (2-object-tracked).

The third object is tracked and image 3 is shown (3-object-tracked).

Now my problem: the first object should be tracked and the first image should be shown again (1object-tracked-again).

THX for help :)
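
A hedged sketch of one possible cause and fix (an assumption, not a verified diagnosis): XRc, YRc and the other coordinates keep their last non-zero value forever, so once a later color has been seen, its flag keeps winning. Using the per-frame pixel counts instead means only a color actually present in the current frame selects the image:

    // at the end of draw(), instead of the XRc / XRh / XRc2 checks
    if (compteur > 0) {              // green pixels found this frame
      image(img, 0, 0);
    } else if (compteur2 > 0) {      // blue pixels found this frame
      image(img2, 0, 0);
    } else if (compteur3 > 0) {      // red pixels found this frame
      image(img3, 0, 0);
    }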

Unfolding Maps: Can't initialize a map and set the size/position


Hi All,

I'm relatively new at this, so please be kind; I'm sorry if this is a super dumb question.

I am trying to make a basic map using the Unfolding Maps library (which I intend to incorporate into another sketch once I can get it working). I have been able to get the map running at full window size, but I need to scale it down. From everything I have read, this should be relatively simple: just some position and size parameters added to the initializer. However, when I do this (no matter what parameters I use) the map goes to position 0,0 and a size of roughly 500 x 500 px (see image).

Here is the code I've got:

import de.fhpotsdam.unfolding.*;
import de.fhpotsdam.unfolding.core.*;
import de.fhpotsdam.unfolding.data.*;
import de.fhpotsdam.unfolding.events.*;
import de.fhpotsdam.unfolding.geo.*;
import de.fhpotsdam.unfolding.interactions.*;
import de.fhpotsdam.unfolding.mapdisplay.*;
import de.fhpotsdam.unfolding.mapdisplay.shaders.*;
import de.fhpotsdam.unfolding.marker.*;
import de.fhpotsdam.unfolding.providers.*;
import de.fhpotsdam.unfolding.texture.*;
import de.fhpotsdam.unfolding.tiles.*;
import de.fhpotsdam.unfolding.ui.*;
import de.fhpotsdam.unfolding.utils.*;
import de.fhpotsdam.utils.*;
import de.fhpotsdam.unfolding.providers.Google;

AbstractMapProvider p1 = new Google.GoogleSimplifiedProvider();
UnfoldingMap map;

void settings() {
  size(800, 800, P2D);

  map = new UnfoldingMap(this, 100, 100, 100, 100, p1 ); //this line positions the map at 0,0 and around 500,500 size-see above image
  //map = new UnfoldingMap(this, p1 ); //this works properly but the map is full screen
}

void setup() {

  //map = new UnfoldingMap(this, p1 ); if I try to put this line here I get this error: java.lang.NoSuchFieldError:quailty (unless I removed P2D from size)
  Location melbourneLocation = new Location(-37.815924f, 144.966815f);
  float maxPanningDistance = 0.5; // in km
  map.zoomAndPanTo(15, melbourneLocation);
  map.setPanningRestriction(melbourneLocation, maxPanningDistance);
  map.setZoomRange(14, 18);
  MapUtils.createDefaultEventDispatcher(this, map);
}

void draw() {

  map.draw();
}

Any help would be amazing!
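
A minimal sketch of one thing worth trying (an assumption on my part: settings() is only meant for size()/fullScreen(), so constructing the map in setup() is the usual pattern; the NoSuchFieldError seen with P2D often points to an Unfolding build that doesn't match the Processing version). The rest of the setup from above stays the same:

    void settings() {
      size(800, 800, P2D);
    }

    void setup() {
      // construct the map here rather than in settings()
      map = new UnfoldingMap(this, 100, 100, 600, 600, p1);
      Location melbourneLocation = new Location(-37.815924f, 144.966815f);
      map.zoomAndPanTo(15, melbourneLocation);
      MapUtils.createDefaultEventDispatcher(this, map);
    }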

Urgent question about video-related code! Help me..


This is the main tab for my code.

import ddf.minim.*;
import processing.video.*;

Movie mov;
Minim minim;
AudioPlayer s1, s2, s3, s4, s5, s6, s7, s8, s9, s10, s11, s12, s13, s14, bg1, bg2, bg3, bg4, currentBackingTrack;

PImage bg;
PFont font;
int num = 500;

// the tune
char t[] = new char[num];

int key_num = 0;

int val;
int notetime;
int currentBackingTrackDelay;

// for play
int startTime=0;
int startTimeBGM=0;
int iForPlaySong=0;
boolean playSong=false;

// for video
boolean mo = true;

int countPlaying = 0;

void setup() {
  fullScreen();

  bg = loadImage("paper.jpg");
  bg.resize(displayWidth, displayHeight);
  mov = new Movie(this, "hello.mov");
  mov.play();

  font = createFont("Albertsthal Typewriter.ttf", 64);
  textFont(font, 64);
  fill(40);

  minim = new Minim(this);

  s1 = minim.loadFile("1.mp3");
  s2 = minim.loadFile("2.mp3");
  s3 = minim.loadFile("3.mp3");
  s4 = minim.loadFile("4.mp3");
  s5 = minim.loadFile("5.mp3");
  s6 = minim.loadFile("6.mp3");
  s7 = minim.loadFile("7.mp3");
  s8 = minim.loadFile("8.mp3");
  s9 = minim.loadFile("9.mp3");
  s10 = minim.loadFile("10.mp3");
  s11 = minim.loadFile("11.mp3");
  s12 = minim.loadFile("12.mp3");
  s13 = minim.loadFile("13.mp3");
  s14 = minim.loadFile("14.mp3");

  bg1 = minim.loadFile("bg1.mp3");
  bg2 = minim.loadFile("bg2.mp3");
  bg3 = minim.loadFile("bg3.mp3");
  bg4 = minim.loadFile("bg4.mp3");
}

void movieEvent(Movie m) {
  m.read();
}


void draw() {

  // for the intro video

  if ( mo == true) {
    image(mov, 0, 0);
  }

  if (mo && mov.time() >= mov.duration()) {
    mov.stop();
    mo = false;
    background(bg);

    // two main situations

    if (playSong)
    {
      playTheSong();
    } else
    {
      readTheInputs();
    }
  }
}

Without the video-related code, it works perfectly.

However, when I add the video code, I can't see the readTheInputs part.

I think I made a mistake in how I structured this code.

Can you help me to fix this?
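
A minimal restructuring sketch (assuming playTheSong() and readTheInputs() are defined in other tabs): stop the movie once, and run the two main situations on every frame after that, instead of only on the single frame where the movie ends.

    void draw() {
      if (mo) {
        image(mov, 0, 0);
        if (mov.time() >= mov.duration()) {
          mov.stop();
          mo = false;
        }
      } else {
        background(bg);
        if (playSong) {
          playTheSong();
        } else {
          readTheInputs();
        }
      }
    }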

Problem drawing text & rect etc. inside an oscEvent | oscP5 & ControlP5


Hi there,

I've written a little OSC controller for Millumin. All my GUI & OSC sending works as expected. All calls made directly to the button handlers work fine, and I can display text, shapes, etc.

I also want to listen to TouchOSC on port 5001, and once an OSC message corresponds to a specific parameter it should trigger a series of actions: trigger a button event (so I can reuse the same code as if I had clicked on the UI), and send an OSC message.

Now both ways of doing that work, as far as getting feedback on the console or sending an OSC message to the app.

My Problem is:

None of these methods allow me to draw anything in the Processing app. I want to display some text saying what the last button press was. Calling the same button functions / event handlers from the OSC event handler prints to the console and sends OSC, but won't draw anything.

Just wondering, what am I doing wrong? I can verify that the whole if statement gets executed, but none of the draw-related commands work (fill(); rect(); text(); background();).

See Code Below:

// Millumin OSC Controller
// June 2016 - Olivier Jean.
// Manual Trigger via Processing UI & also listen to port 5001 for OSC trigger ( Here Touch Osc )

// GUI LIBRARY's
import controlP5.*;

// OSC LIBRARY's
import netP5.*;
import oscP5.*;

// Instantiate P5
ControlP5 cp5;

// Instantiate OSC
OscP5 oscP5;
NetAddress myBroadcastLocation; // Used in the Millumin SDK example

// we're counting since when we been running. (Disable Premature Button Press :/ I think there's setBroadcast(false) ?
//long timeElapsed;
boolean activate = false;

// Background color variable.
public int myColor = color(25);
public int myBkg = color(25);

// GUI Variables
int spacerTop = 250;

// floats to receive OSC values.
float v_toggle1 = 0.0f;
float v_toggle2 = 0.0f;
float v_toggle3 = 0.0f;
float v_toggle4 = 0.0f;
float v_toggle5 = 0.0f;
float v_toggle6 = 0.0f;

String s  = "heyman";

////////////  SETUP  ////////////
void setup() {
  size(240, 700);
  noStroke();
  background(myBkg);

 int spacefromtop = 60;
cp5 = new ControlP5(this);

  // create a new button with name 'buttonA'
  cp5.addButton("Button_A")
     .setValue(1)
     .setPosition(20,20)
     .setSize(200,39)
     .setColorBackground(45)
     .setColorForeground(color(100,100,100))
     .setColorActive(color(255, 0, 0))
     .setColorValue(color(255, 255, 0))
     .setColorLabel(color(255, 255, 255))
     .setCaptionLabel("FADE TO BACK")
     ;

  cp5.addButton("Button_B")
     .setValue(1)
     .setPosition(20,spacefromtop*1+20)
     .setSize(200,39)
     .setCaptionLabel("PLAY COLUMN 2")
     ;

  cp5.addButton("Button_C")
     .setValue(0)
     .setPosition(20,spacefromtop*2+20)
     .setSize(200,39)
     .setCaptionLabel("PLAY COLUMN 3")
     ;

  cp5.addButton("Button_D")
     .setValue(0)
     .setPosition(20,spacefromtop*3+20)
     .setSize(200,39)
     .setCaptionLabel("PLAY COLUMN 4")
     ;

  cp5.addButton("Button_E")
     .setValue(0)
     .setPosition(20,spacefromtop*4+20)
     .setSize(200,39)
     .setCaptionLabel("PLAY COLUMN 5")
     ;

  cp5.addButton("Button_F")
     .setValue(0)
     .setPosition(20,spacefromtop*5+20)
     .setSize(200,39)
     .setCaptionLabel("PLAY COLUMN 6")
     ;

  // Based on the Millumin SDK example @ GitHub
  oscP5 = new OscP5(this, 5001); //
  myBroadcastLocation = new NetAddress("255.255.255.255", 5000); // change local host to multicast address

  DisplayStatusText("Tx [5000] & Rx [5001]");
}

////////////  XXXXX  ////////////


//////////// DRAW MAIN LOOP ////////////
void draw() {
  if (millis() > 5000) activate = true ; // we disable OSC for 5sec ( prevents P5 setup event from sending OSC messages )

  // println(millis());

}
//////////// XXXXXXXXXXXX ////////////


//////////// BUTTON EVENT HANDLERS ////////////

// for version of the code we built in the osc sending function into the button A handlers.
// for later version we would call the OSC Send function.

// function Button_A will receive changes from
// controller with name Button_A
public void Button_A(int theValue) {
  println("a button event from Button_A: "+theValue);
  println("Sending you some OSC Juice");
  if ( activate == true ) {
    OscMessage myOscMessage = new OscMessage("/millumin/action/launchColumn");
    myOscMessage.add(1);
    oscP5.send(myOscMessage, myBroadcastLocation);
    DisplayStatusText("OSC Message ' FADE TO BLACK '");
    myColor = color(30, 30, 30);
    println ("Did you get that sweet OSC Juice ?");
  }
}

public void Button_B(int theValue) {
  println("a button event from Button_B: "+theValue);
  if (activate == true) {
    OscMessage myOscMessage = new OscMessage("/millumin/action/launchColumn");
    myOscMessage.add(2);
    oscP5.send(myOscMessage, myBroadcastLocation);
    DisplayStatusText("OSC Message ' COLUMM 02 '");
    myColor = color(255, 0, 0);
  }
}

public void Button_C(int theValue) {
  println("a button event from Button_C: "+theValue);
  if (activate == true) {
    OscMessage myOscMessage = new OscMessage("/millumin/action/launchColumn");
    myOscMessage.add(3);
    oscP5.send(myOscMessage, myBroadcastLocation);
    DisplayStatusText("OSC Message ' COLUMM 03 '");
    myColor= color(255, 255, 0);
  }
}

public void Button_D(int theValue) {
  println("a button event from Button_D: "+theValue);
  if (activate == true) {
    OscMessage myOscMessage = new OscMessage("/millumin/action/launchColumn");
    myOscMessage.add(4);
    oscP5.send(myOscMessage, myBroadcastLocation);
    DisplayStatusText("OSC Message ' COLUMM 04 '");
    myColor= color(255, 255, 0);
  }
}

public void Button_E(int theValue) {
  println("a button event from Button_E: "+theValue);
  if (activate == true) {
    OscMessage myOscMessage = new OscMessage("/millumin/action/launchColumn");
    myOscMessage.add(5);
    oscP5.send(myOscMessage, myBroadcastLocation);
    DisplayStatusText("OSC Message ' COLUMM 05 '");
    myColor= color(255, 255, 0);
  }
}

public void Button_F(int theValue) {
  println("a button event from Button_F: "+theValue);
  if (activate == true) {
    OscMessage myOscMessage = new OscMessage("/millumin/action/launchColumn");
    myOscMessage.add(6);
    oscP5.send(myOscMessage, myBroadcastLocation);
    DisplayStatusText("OSC Message ' COLUMM 06 '");
    myColor= color(255, 255, 0);
  }
}


/////////// XXXXXXXXXXXXXXXX ////////////

/////////// OSC MESSAGE * IN * HANDLERS ////////////

public void oscEvent(OscMessage theOscMessage) {

  String addr = theOscMessage.addrPattern();
  float  val  = theOscMessage.get(0).floatValue();

  if ((addr.equals("/1/black")) && (val == 1.0f)) {
    Button_A(1);
  }
  if ((addr.equals("/1/playb")) && (val == 1.0f)) {
    v_toggle2 = val;
    Button_B(1);
  }
  if ((addr.equals("/1/playc")) && (val == 1.0f)) {
    v_toggle3 = val;
    DisplayStatusText("OSC Message ' COLUMM 03 '");
    OscMessage myOscMessage = new OscMessage("/millumin/action/launchColumn");
    myOscMessage.add(3);
    oscP5.send(myOscMessage, myBroadcastLocation);

  }
  if ((addr.equals("/1/playd")) && (val == 1.0f))  {
    DisplayStatusText("OSC Message ' COLUMM 04 '");
    v_toggle4 = val;
    OscMessage myOscMessage = new OscMessage("/millumin/action/launchColumn");
    myOscMessage.add(4);
    oscP5.send(myOscMessage, myBroadcastLocation);

  }
  if ((addr.equals("/1/playe")) && (val == 1.0f)) {
    v_toggle5 = val;
    OscMessage myOscMessage = new OscMessage("/millumin/action/launchColumn");
    myOscMessage.add(5);
    oscP5.send(myOscMessage, myBroadcastLocation);
    fill(myBkg);
    rect(10, spacerTop + 220, 220, 50);
    fill(255, 255, 255);
    text(" COLUMN 5 ", 20, spacerTop + 232);
    fill(myColor);
  }
  if ((addr.equals("/1/playf")) && (val == 1.0f)) {
    OscMessage myOscMessage = new OscMessage("/millumin/action/launchColumn");
    myOscMessage.add(6);
    oscP5.send(myOscMessage, myBroadcastLocation);
    background(123,123,157);
    fill(128);
    rect(10, 20, 220, 500);
    fill(255, 255, 255);
    text(" COLUMN 6 ", 0, 0);
    fill(25);
    println("testing handler - got it");

  }
}

//////////// XXXXXXXXXXXXXXXX ////////////

//////////// DISPLAY STATUS MESSAGE ////////////

public void DisplayStatusText(String theTextMessage) {
  fill(myBkg);
  rect(10, 470, 220, 50);
  fill(255, 255, 255);
  text(": " + theTextMessage, 20, 482);
  fill(myColor);
}

//////////// XXXXXXXXXXXXXXXX ////////////

Thanks a lot ;)
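
A minimal sketch of the pattern worth trying (an assumption about the cause: drawing calls issued from oscEvent(), which runs outside draw(), generally don't end up on screen): store the status in a variable from oscEvent() and do all actual drawing in draw(). This reuses myBkg and myColor from the code above.

    String lastStatus = "";

    public void oscEvent(OscMessage theOscMessage) {
      // ...decode the message and send OSC as in the code above...
      lastStatus = "OSC Message ' COLUMN 05 '";   // only remember what to display
    }

    void draw() {
      fill(myBkg);
      rect(10, 470, 220, 50);                     // clear the status area
      fill(255);
      text(": " + lastStatus, 20, 482);           // the drawing itself happens here
      fill(myColor);
    }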


How to respond with each point separately in the spiral


Hello everyone, I am a student from Germany and my English isn't the best; sorry for that. I need your help for a project named visual music. I don't have much experience in Processing, so I hope you can help me.

My code interprets data from Csound in Processing. The code works, but now I want to change the colour and size values of each generated point in my spiral. They should change depending on the imported data (int circle_color_r, circle_color_g, circle_color_b and float value). Here is my Processing code. Do you have a solution for me?

import netP5.*;
import oscP5.*;
OscP5 oscP5;
NetAddress myRemoteLocation;

int circle_color_r ;
int circle_color_g ;
int circle_color_b ;
float ang=0;
int mx,my;
float value;
float size;

void setup(){
  size(500,500);
  background(0);
  frameRate(100);
  mx=width/2;
  my=height/2;
  oscP5 = new OscP5(this,12000); // port number
}

void draw(){
  strokeWeight(size); // weight of the points
  rect(0,0,width,height);
  float ray=0.2;
  ang-=1.1; // angular velocity

  for(int i=0;i<650;i++){
    ray*=1.0022;
    ray+=0.2;
    float maxray = ray*1.1; // spread
    float a = radians(ang+i);

    stroke(circle_color_r, circle_color_g, circle_color_b);
    point(mx+ cos(a)*random(ray,maxray),my+sin(a)*random(ray,maxray));
  }
   this.size = value*10;
   saveFrame();
 }
void oscEvent(OscMessage theOscMessage) {
   if(theOscMessage.checkAddrPattern("/freq")==true)
  {
   value = theOscMessage.get(0).floatValue();
  print(value);
  }
  if(theOscMessage.checkAddrPattern("/color")==true)
  {
   circle_color_r = theOscMessage.get(0).intValue();
   circle_color_g = theOscMessage.get(1).intValue();
   circle_color_b = theOscMessage.get(2).intValue();
print(circle_color_r);
  }
}
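
A hedged sketch of one way to give each point its own colour and size (the arrays and the wrap-around index are assumptions, not part of the original code): keep a value per point, fill it from OSC, and read it back while drawing.

    int nPoints = 650;
    color[] pointColor = new color[nPoints];
    float[] pointSize  = new float[nPoints];
    int nextIndex = 0;

    void oscEvent(OscMessage theOscMessage) {
      if (theOscMessage.checkAddrPattern("/freq")) {
        pointSize[nextIndex] = theOscMessage.get(0).floatValue() * 10;
      }
      if (theOscMessage.checkAddrPattern("/color")) {
        pointColor[nextIndex] = color(theOscMessage.get(0).intValue(),
                                      theOscMessage.get(1).intValue(),
                                      theOscMessage.get(2).intValue());
        nextIndex = (nextIndex + 1) % nPoints;    // move on to the next point
      }
    }

    // inside the for loop in draw(), before point():
    //   stroke(pointColor[i]);
    //   strokeWeight(max(1.0f, pointSize[i]));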

Multiple windows and OpenGL OBJ rendering


Hello,

I am new to Processing.

I would like to show a 3D object in two windows. The orientation of the object in each window is given by roll, pitch, and yaw coming from the serial port.

I succeeded in launching a new window using a second applet, but I don't know how to render and rotate the 3D object in this second window.

Here's the code I am using:

import processing.serial.*;
import java.awt.datatransfer.*;
import java.awt.Toolkit;
import processing.opengl.*;
import saito.objloader.*;
import g4p_controls.*;
import javax.swing.*;
import javax.swing.JFrame;

float xx, yy, zz;
float xx2, yy2, zz2;
// import the Processing serial library
Serial myPort;                  // The serial port

float bgcolor;            // Background color
float fgcolor;            // Fill color
float xpos, ypos;            // Starting position of the ball
OBJModel model;
OBJModel model2;
// UI controls.
GPanel    configPanel;
GDropList serialList;
GLabel    serialLabel;
GLabel    calLabel;
GCheckbox printSerialCheckbox;

PFrame f;
secondApplet s;




public class PFrame extends JFrame {
  public PFrame() {
    setBounds(0, 0, 640, 480);
    s = new secondApplet();
    add(s);
    s.init();
    show();
  }
}


public class secondApplet extends PApplet {
    public void setup() {
     size(640, 480, OPENGL);
  fill(0);
  //framerate(30);

  // List all the available serial ports
  //println(Serial.list());
  }

  public void draw() {
    background(0);
  noStroke();
  translate(width/3, height/2);
  pushMatrix();
  // Simple 3 point lighting for dramatic effect.
  // Slightly red light in upper right, slightly blue light in upper left, and white light from behind.
  pointLight(255, 200, 200, 400, 400, 500);
  pointLight(200, 200, 255, -400, 400, 500);
  pointLight(255, 255, 255, 0, 0, -500);

  rotateX(-yy);//pitch
  rotateY(zz);//yaw
  rotateZ(-xx);//roll
  model2.draw();
  translate(2*width/3,height/2);
   popMatrix();
   }
}

void setup() {

  size(640, 480, OPENGL);
  fill(0);
  //framerate(30);

  model= new OBJModel(this,"Vagabond.obj");
  model2= new OBJModel(this,"Vagabond.obj");

  model.scale(40);
  model2.scale(40);

   PFrame p = new PFrame();
    myPort = new Serial(this, Serial.list()[0], 115200);
  // read bytes into a buffer until you get a linefeed (ASCII 10):
  myPort.bufferUntil('\n');
  //  smooth();
}

void draw() {
  background(0);
  noStroke();
  translate(width/3, height/2);
  pushMatrix();
  // Simple 3 point lighting for dramatic effect.
  // Slightly red light in upper right, slightly blue light in upper left, and white light from behind.
  pointLight(255, 200, 200, 400, 400, 500);
  pointLight(200, 200, 255, -400, 400, 500);
  pointLight(255, 255, 255, 0, 0, -500);

  rotateX(-yy);//pitch
  rotateY(zz);//yaw
  rotateZ(-xx);//roll
  model.draw();
  translate(2*width/3,height/2);
   popMatrix();
}


void serialEvent(Serial myPort) {
  // read the serial buffer:
  String myString = myPort.readStringUntil('\n');
  // if you got any bytes other than the linefeed:
  myString = trim(myString);

  // split the string at the commas
  // and convert the sections into integers:
  float sensors[] = float(split(myString, '\t'));
  println("roll: " + sensors[0] + " pitch: " + sensors[1] + " yaw: " + sensors[2] + "\t" + "roll: " + sensors[5] + " pitch: " + sensors[4] + " yaw: " + sensors[3] + "\t");
  // print out the values you got:
  for (int sensorNum = 0; sensorNum < sensors.length; sensorNum++) {
    //print("Sensor " + sensorNum + ": " + sensors[sensorNum] + "\t");
    if (sensorNum == 0) {
      xx = sensors[0];
      xx2= sensors[3];
    }
    if (sensorNum == 1) {
      yy = sensors[1];
      yy2 = sensors[4];
    }
    if (sensorNum == 2) {
      zz = sensors[2];
      zz2=sensors[5];

    }
  }

}

Any help would be appreciated.
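
A minimal sketch of the Processing 3 way to open a second window (an assumption that Processing 3 is an option here; the JFrame/init() approach is replaced by PApplet.runSketch(), and each window gets its own settings() and draw()):

    SecondApplet second;

    void setup() {
      size(640, 480, P3D);
      second = new SecondApplet();
      PApplet.runSketch(new String[] { "SecondWindow" }, second);
    }

    void draw() {
      background(0);
      // draw and rotate the first model here using xx, yy, zz
    }

    class SecondApplet extends PApplet {
      public void settings() {
        size(640, 480, P3D);
      }
      public void draw() {
        background(0);
        // draw and rotate the second model here using xx2, yy2, zz2
      }
    }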

Can't use webcam with higher resolutions


Hello, processers!

Here I have this peculiar behaviour of my webcam (or PC) when trying to get a higher-resolution webcam feed with these lines:

import processing.video.*;

Capture cam;
void setup() {
  //size(1280,960);
  fullScreen();
  cam = new Capture(this, 1280, 960, "FSC WebCam 130");
  cam.start();
  printArray(Capture.list());
}

void draw() {
  if (cam.available()) {
    cam.read();
  }

  image(cam, 0, 0, 1280, 1024);
}

A black screen or a single still frame appears when the program is executed; upon closing, it gives me this error: "(java.exe:4800): GStreamer-CRITICAL **: Trying to dispose element Video Capture, but it is in PAUSED instead of the NULL state. You need to explicitly set elements to the NULL state before dropping the final reference, to allow them to clean up. This problem may also be caused by a refcounting bug in the application or some element."

Webcam used: Fujitsu Siemens ( http://mobilespecs.net/webcam/Fujitsu-Siemens/Fujitsu-Siemens_Webcam_130_portable.html )

If someone has an idea how to solve it, it would help immensely! Cheers!
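
A minimal sketch of one thing to try (an assumption about the cause: the camera may not support 1280x960 at the default frame rate, so requesting one of its listed modes verbatim can help): print Capture.list() before constructing the capture and pass one of the listed entries back unchanged.

    import processing.video.*;

    Capture cam;

    void setup() {
      size(1280, 960);
      String[] cameras = Capture.list();
      printArray(cameras);
      // entries typically look like "name=FSC WebCam 130,size=1280x960,fps=15";
      // replace the index below with the entry that matches the wanted mode
      cam = new Capture(this, cameras[0]);
      cam.start();
    }

    void draw() {
      if (cam.available()) {
        cam.read();
      }
      image(cam, 0, 0);
    }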

How to use millis() in static methods?


Hello, I want to measure the time within the serialEvent method.

static class SerialPortReader implements SerialPortEventListener {

    public void serialEvent(SerialPortEvent event) {
        startLoop = millis();
        ...
        ...
        long endLoop = millis();
        println("Looptime = "+(endLoop - startLoop));

    }// End of serialEvent
}// End of SerialPortReader

I'm not very static-friendly :)) ;)
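
One static-friendly alternative is Java's own clock; a minimal sketch (this measures wall-clock time much like millis() does, just without needing the PApplet instance):

    static class SerialPortReader implements SerialPortEventListener {

        public void serialEvent(SerialPortEvent event) {
            long startLoop = System.currentTimeMillis();   // static-friendly timer
            // ... handle the event ...
            long endLoop = System.currentTimeMillis();
            System.out.println("Looptime = " + (endLoop - startLoop) + " ms");
        }
    }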

Radio button labels in controlP5


Hi everyone!

I have been using Processing for a couple of weeks now and am trying to develop a GUI where I can get user input via checkboxes, radio buttons, dropdowns, etc.

But I'm running into trouble with the radio buttons. Specifically, I'm using setImages() in order to replace the simple box that is the default (replacing it with an image of a circle with or without a fill; see screenshot). This part works! But then the labels are not visible. I've tried showLabels(), but it doesn't do anything if I'm using it with setImages(); it works fine if I comment out the setImages() part. Here is my code snippet:

//Note, drawInputsP1() gets called when a button is pressed on the previous page, it's not called in draw().

void drawInputsP1() {
  cp5_1 = new ControlP5(this);
  PImage rb[] = { loadImageIO("rb_default.tif"), loadImageIO("rb_over.tif"), loadImageIO("rb_active.tif") };
  radio_button1 = cp5_1.addRadioButton("experience");
  rb[0].resize(20, 20);
  rb[1].resize(20, 20);
  rb[2].resize(20, 20);
  radio_button1.setPosition(width/2, height/2)
    .setSize(rb[0])
    .setImages(rb[0], rb[1], rb[2])
    .addItem("A", 1)
    .addItem("B", 2)
    .addItem("C", 3)
    ;
}

(screenshot: radio_buttons)

Is there a way to make the labels of each option visible while still using my own images? If not, does anyone have other suggestions on how to implement a radio button that looks more like the screenshot above?

Thank you in advance!

PImage won't draw to screen (Involves Tables and Classes)


I'm trying to make a Guitar Hero / Rock Band-like game where the player presses keys when the notes pass over the hit boxes, but for some reason the images for the notes have stopped displaying. I am currently loading the note data from a .csv file (x, y, speed, colour) and displaying the image based on that data.

Again the problem that I am having is that the images aren't displaying.

Code That Runs:

Song song1 = new Song(10000, "data\\songs\\test.csv");

void setup() {
  size(1080, 720);
  frameRate(30);
  audio = new PlayAudio(this);
  song1.load_notes();
}

void draw() {
  background(255);
  song1.draw_notes();
}

Note Code:

abstract class _Note {
  float posx, posy, speed;
  String colour;
  PImage img1, img2, img3, img4;

  void loadImg() {
    img1 = loadImage("graphics\\r_note1.png");
    img2 = loadImage("graphics\\b_note1.png");
    img3 = loadImage("graphics\\g_note1.png");
    img4 = loadImage("graphics\\y_note1.png");
  }

  void display() {
    if (colour == "red") image(img1, posx, posy);
    if (colour == "blue") image(img2, posx, posy);
    if (colour == "green") image(img3, posx, posy);
    if (colour == "yellow") image(img4, posx, posy);
  }
}

class Note extends _Note {
  Note(float x, float y, float s, String c) {
    posx = x;
    posy = y;
    speed = s;
    colour = c;
  }
}

Song Code:

abstract class _Song {
  Table note_table;
  String file;
  ArrayList<Note> note = new ArrayList<Note>();
  int seconds, time;

  void load_notes() {
    note_table = loadTable(file, "header");
    for (int i = 0; i < note_table.getRowCount(); i++) {
      note.add(new Note(note_table.getFloat(i, 0), note_table.getFloat(i, 1),
                        note_table.getInt(i, 2), note_table.getString(i, 3)));
    }
    for (int i = 0; i < note.size(); i++) {
      note.get(i).loadImg();
    }
    println("|--X--|--Y--|Speed|Colour|");
    for (int i = 0; i < note.size(); i++) {
      println(note.get(i).posx + "|" + note.get(i).posy + "|" + note.get(i).speed + "|" + note.get(i).colour);
    }
  }

  void draw_notes() {
    for (int i = 0; i < note.size(); i++) {
      note.get(i).display();
      //note.get(i).posx = note.get(i).posx + note.get(i).speed;
    }
  }
}

class Song extends _Song {
  Song(int s, String f) {
    seconds = s;
    file = f;
  }
}

CSV File

--x--,-y-,speed,colour
100,100,5,red
200,200,5,blue
300,300,5,green
400,400,5,yellow
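
One thing worth checking (an assumption, not a confirmed diagnosis): colour == "red" compares object references in Java, so with Strings read from a Table the comparisons can all be false and nothing gets drawn, even though loading works. A sketch of display() using content comparison, with trim() guarding against stray spaces in the CSV:

    void display() {
      String c = colour.trim();
      if (c.equals("red"))    image(img1, posx, posy);
      if (c.equals("blue"))   image(img2, posx, posy);
      if (c.equals("green"))  image(img3, posx, posy);
      if (c.equals("yellow")) image(img4, posx, posy);
    }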

[Basic] OpenCV for Processing + SimpleOpenNI


Hi forum, I am trying to use the new OpenCV for Processing library. It seems great! But all the examples use test images. I want to use SimpleOpenNI (Kinect) with this library. Can someone help me with this?

A basic example in SimpleOpenNI looks like:

SimpleOpenNI context;

void setup()
{
  size(640*2, 480);
  context = new SimpleOpenNI(this);
  context.enableDepth();
}

void draw()
{
  // update the cam
  context.update();
  image(context.depthImage(), 0, 0);
}
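
A minimal sketch of one way to combine the two (an assumption about the APIs involved: OpenCV for Processing can load any PImage, and SimpleOpenNI's depthImage() is a PImage, so the depth frame can be handed to OpenCV each frame):

    import gab.opencv.*;
    import SimpleOpenNI.*;

    SimpleOpenNI context;
    OpenCV opencv;

    void setup() {
      size(640, 480);
      context = new SimpleOpenNI(this);
      context.enableDepth();
      opencv = new OpenCV(this, 640, 480);
    }

    void draw() {
      context.update();
      opencv.loadImage(context.depthImage());   // feed the Kinect depth frame to OpenCV
      opencv.findCannyEdges(20, 75);            // any OpenCV operation could go here
      image(opencv.getOutput(), 0, 0);
    }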

Problem capturing cam with external USB videocard


Good afternoon. I have a problem: I need to capture video from an external camera for my ROV. I have a camera (composite out) and a USB video capture card, both working properly (I've checked them). I'm trying to use them in Processing with the following lines:

import processing.video.*;
Capture cam;
void setup() {
  size(640, 480);
  String[] cameras = Capture.list();
  if (cameras.length == 0) {
    println("There are no cameras available for capture.");
    exit();
  } else {
    println("Available cameras:");
    for (int i = 0; i < cameras.length; i++) {
      println(cameras[i]);
    }
    // The camera can be initialized directly using an
    // element from the array returned by list():
    cam = new Capture(this, cameras[18]);
    cam.start();
  }
}
void draw() {
  if (cam.available() == true) {
    cam.read();
  }
  image(cam, 0, 0);
  // The following does the same, and is faster when just drawing the image
  // without any additional resizing, transformations, or tint.
  //set(0, 0, cam);
}

The USB video capture card is correctly listed (number 18) but, once started, Processing shows only a gray screen and returns the following errors... How can I solve this? The number in the error changes every time, but the text of the error is always the same, as follows... Thanks in advance, Andrew


**(java.exe:1796): GStreamer-CRITICAL **: Trying to dispose element rgb, but it is in PAUSED instead of the NULL state. You need to explicitly set elements to the NULL state before dropping the final reference, to allow them to clean up. This problem may also be caused by a refcounting bug in the application or some element.

(java.exe:1796): GStreamer-CRITICAL **: Trying to dispose element Video Capture, but it is in PAUSED instead of the NULL state. You need to explicitly set elements to the NULL state before dropping the final reference, to allow them to clean up. This problem may also be caused by a refcounting bug in the application or some element.**


Does Unfolding work with P3?


It does not show up as compatible from within P3. But when I went to GitHub, I noticed that it seems to have already been updated for P3. However, when I downloaded the package from GitHub, manually extracted it, and put it in the libraries folder, it still does not show up in the P3 IDE. Has anybody had any success in making it work with P3?

Thanks, Frank

WebP load method?


The title says it all; is there a library that, e.g., adds a method to loadImage() to load WebP images, or something like that? ;)

High Resolution Output for P3D


I'm using the following code snippet (from https://amnonp5.wordpress.com/2012/01/28/25-life-saving-tips-for-processing/ - number 16) to export a scaled-up, high-resolution file. Everything works perfectly as long as size() and createGraphics() use the 2D renderer:

void setup() {
  size(500, 500); // IMPLICIT USE OF 2D RENDERER
}

void draw() {
  background(255);
  smooth();
  strokeWeight(10);
  fill(255, 0, 0);
  ellipse(width/2, height/2, 200, 200);
}

void keyPressed() {
  if (key == 's') {
    save("normal.png");
    saveHiRes(5);
    exit();
  }
}

void saveHiRes(int scaleFactor) {
  PGraphics hires = createGraphics(width*scaleFactor, height*scaleFactor, JAVA2D); // CREATION OF 2D PGRAPHICS OBJECT
  beginRecord(hires);
  hires.scale(scaleFactor);
  draw();
  endRecord();
  hires.save("hires.png");
}

However, once the two bold lines use P3D instead (code at the end of this post), the program has two problems: 1. The following error message appears: "OpenGL error 1282 at top endDraw(): invalid operation" (on Win7 with updated drivers). 2. Scaling doesn't actually happen; instead, the unscaled image sits in the bottom left of the hi-res PGraphics object, like this: (screenshot: _hires_800px_test)

The correct result should look like this: (screenshot: normal)

After researching this, I couldn't find any high-resolution output solution for people working with a 3D renderer. I don't care so much about the particular way the hi-res output gets created, and have tried several approaches, without luck yet.

Any help would be greatly appreciated!

Here's the 3D version of the initial code snippet:

void setup() {
  size(500, 500, P3D); // 3D RENDERER
}

void draw() {
  background(255);
  smooth();
  strokeWeight(10);
  fill(255, 0, 0);
  ellipse(width/2, height/2, 200, 200);
}

void keyPressed() {
  if (key == 's') {
    save("normal.png");
    saveHiRes(5);
    exit();
  }
}

void saveHiRes(int scaleFactor) {
  PGraphics hires = createGraphics(width*scaleFactor, height*scaleFactor, P3D); // CREATION OF 3D PGRAPHICS OBJECT
  beginRecord(hires);
  hires.scale(scaleFactor);
  draw();
  endRecord();
  hires.save("hires.png");
}
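
A hedged sketch of one approach that avoids beginRecord()/endRecord() entirely (an assumption, not a verified fix): draw directly into the P3D PGraphics via its own beginDraw()/endDraw(), routing all drawing calls through the passed-in surface.

    void saveHiRes(int scaleFactor) {
      PGraphics hires = createGraphics(width * scaleFactor, height * scaleFactor, P3D);
      hires.beginDraw();
      hires.scale(scaleFactor);
      drawScene(hires);
      hires.endDraw();
      hires.save("hires.png");
    }

    // the sketch's drawing code, parameterized over the target surface
    void drawScene(PGraphics pg) {
      pg.background(255);
      pg.strokeWeight(10);
      pg.fill(255, 0, 0);
      pg.ellipse(width/2, height/2, 200, 200);
    }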

Livestreaming: IPCapture (SDP file)


Basically, what I want to do is livestream with the IPCapture library in Processing. I have an external camera attached to a drone, and my PC is connected to the controller's Wi-Fi. This is what I do normally (for streaming): I open a cmd window and run nc to "10.1.1.1 5502". After that, I open the SDP file with VLC. It works perfectly. Instead of using this method, I'm thinking of using Processing to stream my video.

This is the SDP file:

c=IN IP4 10.1.1.1
m=video 5600 RTP/AVP 96
a=rtpmap:96 H264/90000
t=0 0

This is my IPCapture code:

import ipcapture.*;

IPCapture cam;

void setup() {
  size(720,520);
  cam = new IPCapture(this, "http://" + "10.1.1.1:5502/", "root", "admin");
  cam.start();

}

void draw() {
  if (cam.isAvailable()) {
    cam.read();
    image(cam,0,0);
  }
}

void keyPressed() {
  if (key == ' ') {
    if (cam.isAlive()) cam.stop();
    else cam.start();
  }
}

I was able to run the code, but it only shows a grey empty screen with no video. What's wrong with my code? There is no error message in the console. How do I link the SDP file with Processing? Thanks.
