Using delay() in the Processing environment

I am using the Processing language to draw a rectangle that grows over time. The following code produces no output.

    void setup() {
      size(900, 900);
    }

    void draw() {
      int edge = 100;
      for (int i = 0; i < 300; i++) {
        delay(100);
        edge++;
        rect(100, 100, edge, edge);
      }
    }

I suspect that I am using the delay() function incorrectly.

+6
3 answers

I recommend rolling your own delay system using the millis() function.

Check out this example.
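For illustration, here is a minimal sketch of this idea (my own, not the linked example): instead of blocking inside delay(), it uses millis() to grow the question's rectangle by one pixel every 100 ms, so draw() keeps running and the screen keeps updating. The 400-pixel limit is just the original 100 plus 300 growth steps.

    int edge = 100;      // current rectangle size
    int lastStep;        // time of the last growth step, in milliseconds

    void setup() {
      size(900, 900);
      lastStep = millis();
    }

    void draw() {
      background(255);   // clear the previous frame
      // grow by one pixel each time 100 ms have elapsed, up to the final size
      if (millis() - lastStep >= 100 && edge < 400) {
        edge++;
        lastStep = millis();
      }
      rect(100, 100, edge, edge);
    }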

+7

Here is one way of "rolling your own" that is suitable for most purposes. Just change the value passed to the delay method to change the wait time. For example, the code below prints "start" and "end" roughly every 2 seconds.

    void draw() {
      System.out.println("start");
      delay(2000);
      System.out.println("end");
      delay(2000);
    }

    // custom delay: busy-wait until the requested number of milliseconds has passed
    void delay(int delay) {
      int time = millis();
      while (millis() - time <= delay);
    }
+10

In Processing, the screen is not updated until program flow reaches the end of draw().
Try the following instead:

    int edge = 100;   // current rectangle size

    void setup() {
      size(900, 900);
      frameRate(10);  // draw() runs 10 times per second, so the rectangle grows every ~100 ms
    }

    void draw() {
      edge++;
      rect(100, 100, edge, edge);
    }
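As a possible refinement (not part of the original answer), if you also want the growth to stop after the same 300 steps as the original for loop, you could call Processing's noLoop() once the rectangle reaches its final size:

    int edge = 100;

    void setup() {
      size(900, 900);
      frameRate(10);
    }

    void draw() {
      edge++;
      rect(100, 100, edge, edge);
      if (edge >= 400) {   // 100 + 300 steps, matching the original loop
        noLoop();          // stop calling draw() once the rectangle is full size
      }
    }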
+2

Source: https://habr.com/ru/post/949399/

