airsun-viewing04

  • Speculation: Facing the disputed value of the digital and its decaying matter, it actively plays with digital detritus to investigate something new in the currency of the 21st century.
  • Spectacle: Seeing the decaying matter, it resists digital entropy and tries to escape it by building stable ground for value and profit.
  • "Text Rain" by Camille Utterback, in my opinion, is an example of Spectacle. I heard about this work when I was first introduced to Interactive Art. In my impression, "Text Rain" was considered as a representative of this stream of art. Therefore, I think it fulfills this idea of breaking through the stable ground which exists before it. In Warburton's term at 9'08'', I feel like this work lies nearer towards, acceleration, visibility, art, function. For surplus and waste, I am not really sure about its position.

airsun-LookingOutwards03

"Dances of the Sacred & Profane"(2014) done by Camille Utterback is an hour-long collaborative dance work. It has a motion capture projection which utilizes a real-time particle physics system. The dancers' movements will be captured by the cameras and then projected to panel screens at the background. However, the relationship is not linear in which the pattern projected on the screen does not only depict dancers' movements but also other components such as moving particles interacting with each other. The whole aesthetics here is inspired by the music and art of the Impressionist period. The artist is interested in observing and understanding how images offer a visceral connection between the real and the virtual.

What attracts me most about this piece is its novel way of connecting the dancers' bodies with technology. Interactions involving dancers are familiar; we have seen many performances before in which, for example, dancers respond to music, or dancers covered in charcoal draw patterns as they move. However, this project elevates these interactions to a new place through the invitation of technology. The dances combine the uncertainty of human action with the imagination of art. The work strengthens the concept of using our bodies to draw, and it visualizes human interaction in a variety of ways.

airsun-Reading

Looking back on my past artworks and the concepts behind each of them, I see myself leaning more towards "last word art." That is, I rarely spend much time discovering and analyzing the medium itself. Instead, I am a user of mediums that have already been created, and I build new things on top of them. One major focus of my art practice is to "redesign" things. I like to redesign and remake objects that are overlooked in our daily interactions, to bring a new level of novelty and meaning. For example, one of my previous works involved designing a travel guide brochure for a fictional destination. In another, I redesigned the labels on t-shirts so that the product itself carried a narration. In all of these practices, I do not actively recreate the medium; I follow the rules that have already been established, following the conventions of Illustrator and Photoshop, and following the conventional way thread interacts with cloth.

From what I have experienced, I believe the relationship between new technologies and our culture runs in both directions. On one hand, our interactions with each other and our behaviors have been modified and distorted by new technologies. For example, communication, a vital part of forming culture, has been changed by the development of phones and the internet: we communicate more online instead of face-to-face, and we have abandoned many of the rules of face-to-face communication as we moved to virtual communication. On the other hand, because our behaviors change, we develop new needs for how we live, and this is why many technologies are developed. This relationship is an ongoing loop. Art shares this kind of relationship with technology as well. However, I feel that no matter how unstable a technology may become over time, the concept is what should be preserved and valued. I place less value on the medium itself: unless there is a very specific aspect of a work that drives me to engage with a novel technology, I won't risk letting my ideas travel with it.

airsun-clock

When the next minute arrives, the display returns to the top screen, where the hour, minute, weekday, and date are shown
The hour and minute are also displayed at the top left of the screen
The battery at the top right decreases as time passes

First sketch in Illustrator

For this project, I was inspired by one of my most frequent actions: swiping through messages on my phone. Because I interact with my phone so much, it is not only the time displayed at the top centre of the screen that reminds me of the time; the timestamp on each message and the change in the remaining battery also indicate the passage of time. Therefore, I wanted to create a representation of my interaction with time. In the work, the hand swiping the messages represents the seconds. When a new minute arrives, the screen returns to the starting mode with the hour, minute, date, and weekday on it. Moreover, the abstract feed in the background emphasizes my continuous interaction with my phone (walking while swiping). While working on this, I considered the problem that my hand drawing might not mix well with the vector drawing in p5.js; therefore, I first implemented the piece with my own drawings (demonstration shown below), and then switched to vector drawing for consistency.
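Before the full program (listed after the demonstration below), here is a minimal, hypothetical p5.js sketch of just the three time mappings described above: the swiping hand tracks the current second, the hour-and-minute screen appears when a new minute arrives, and the battery shrinks as the hour passes. It is a simplified illustration, not the actual clock code.

// Minimal sketch of the timing logic only (hypothetical; the full clock follows below).
function setup() {
  createCanvas(600, 600);
}

function draw() {
  background(144, 173, 189);
  var H = hour();
  var Min = minute();
  var S = second();
  var mils = millis() % 1000;

  // The swiping hand: one downward sweep per second.
  var handY = map(mils, 0, 1000, 100, 500);
  fill(255);
  ellipse(width / 2, handY, 40, 40);

  // The battery: full at the start of the hour, empty at the end.
  var batteryW = (60 - Min) / 60 * 25;
  rect(width - 80, 20, batteryW, 12);

  // The hour/minute screen: shown only during the first second of each minute.
  if (S < 1) {
    textSize(40);
    text(nf(H, 2) + ":" + nf(Min, 2), width / 2 - 50, height / 2);
  }
}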

experimenting with drawing and movements
// Clair Sijing Sun
// Clock Project
// 60-212
// Sep 20, 2018
 
var prevSec;
var millisRolloverTime;
var frames = [];
var secondH ;
var counter = 0;
var weekdayNames = ["Sunday", "Monday", "Tuesday", "Wednesday", 
  "Thursday", "Friday", "Saturday"]
var monthNames = ['January', 'February', 'March', 'April', 'May', 'June', 'July', 'August', 'September', 'October', 'November', 'December'];
var listofApps = ['WEIBO', 'WECHAT', 'GROUPME', 'GMAIL', 'LINKEDIN', 'FACEBOOK', 'STUDENT', 'SOUNDCLOUD'];
var appL ;
var randomN ;
var batteryR ;
var batteryG ;
var batteryB ;
var batteryL = 25;
var backrockX = 50;
var backrockY = 0;
var barH = 72;
var handM1 = 383;
var handM2 = 369;
var finger = [];
var skin = [];
var feetH ;
var feetH2 ;
 
 
 
//--------------------------
function setup() {
  createCanvas(600, 600);
  millisRolloverTime = 0;
  feetH = height;
  feetH2 = height + 100;
 
}
 
//--------------------------
function preload(){
  frames.push(loadImage("https://i.imgur.com/6LcHHng.png"));
  //frames.push(loadImage("https://i.imgur.com/od1uGbs.png"));
  frames.push(loadImage("https://i.imgur.com/gHs88Af.png"));
  finger.push(loadImage("https://i.imgur.com/QfwFtyL.png"));
  finger.push(loadImage("https://i.imgur.com/WFPnTt3.png"));
  skin.push(loadImage("https://i.imgur.com/vXlzYE7.png"));
  skin.push(loadImage("https://i.imgur.com/KFbFtYv.png"));
}
 
//--------------------------
function draw() {
  background(144, 173, 189); 
 
  push();
  fill(0, 10);
  if (backrockY < height || backrockY == height){
    backrockY = backrockY + 1;
  }else{backrockY = 0}
  for (i = 0; i < 20; i++){
    ellipse(backrockX * i, backrockY+i*10, 4, 120);
  }
  pop();
 
  appL = listofApps.length;
  push();
  fill(232,158,78);
  rect(190,40,210,408, 20);
  pop();
 
  // Fetch the current time
  var H = hour();
  var Min = minute();
  var S = second();
  var D = day();
  var M = month();
  var week = new Date().getDay();
 
  // Reckon the current millisecond,
  // particularly if the second has rolled over.
  // Note that this is more correct than using millis()%1000;
  if (prevSec != S) {
    millisRolloverTime = millis();
  }
  prevSec = S;
  var mils = floor(millis() - millisRolloverTime);
 
  noStroke();
  fill('black');
  var currTimeString = "Time: " + (H%12) + ":" + nf(Min,2) + ":" + nf(S,2) + ((H>12) ? "pm":"am");
 
 
  var hourBarWidth   = map(H, 0, 23, 0, width);
  var minuteBarWidth = map(Min, 0, 59, 0, width);
  var secondBarWidth = map(S, 0, 59, 0, width);
  var secondsWithFraction = S + (mils / 1000.0);
  var secondsWithNoFraction = S;
  var secondBarHeightChunky = map(secondsWithNoFraction, 0, 60, 0, width);
 
  //for displaying the moving hand
  if (mils < 500){
    counter = 0;
  }else{
    counter = 1;
  }
 
  //draw the feet
  push();
  fill(0);
  strokeWeight(4);
  stroke(0);
  if (mils < 500){
    feetH = map(mils, 0, 500, height+100, height);
    feetH2= map(mils, 0, 500, height, height+100);
  } else {
    feetH = map(mils, 500, 1000, height, height+100);
    feetH2 = map(mils, 500, 1000, height+100, height);
  }
  ellipse(width/4, feetH, 140, 200);
  ellipse(width/4+300, feetH2, 140, 200);
  fill(255, 100);
  arc(width/4, feetH-50, 100, 90, PI, 2*PI, CHORD);
  pop();
 
  //draw the top bar
  push();
  fill(255, 100);
  rect(185, 45, 220, 25, 20);
  pop();
 
  //controlling the movement of seconds
  var secondBarHeightSmooth = height;
  if (mils < 300) {
    secondBarHeightSmooth = map(mils, 0, 300, 420, 228);
  } else if (mils >= 300 && mils < 700) {
    secondBarHeightSmooth = 228;
  } else {
    secondBarHeightSmooth = map(mils, 700, 1000, 228, 39);
  }
 
    //changing background for seconds
    push();
    textSize(20);
    fill(255, 150);
    if ((''+S).length == 1){
        S = '0'+ S;
    }
    rect(width/3, secondBarHeightSmooth-25, 190, 40, 10);
    fill(232,158,7);
    text(S, 350, secondBarHeightSmooth);  
    textSize(12);
    //if (S){
        //randomN = random(S%8);
    //}
    text(listofApps[int(S%8)], 220, secondBarHeightSmooth-12);
    fill(0)
    text('Notification', 220, secondBarHeightSmooth+5);
 
    pop();
 
 
    fill(255);
    barH = map(S, 0, 60, 72, 392);
    rect(393, barH, 2, 20, 5);
 
    //display of hour and minutes at the beginning 
    if (S < 1){
        if (mils < 700){
            secondH = 126;
        }else { secondH = map (mils, 700, 1000, 126, -65);}
    }else{
        secondH = -10;
    }
 
    push();
    textStyle(NORMAL);
    textSize(40);
    fill(255);
    textFont('Helvetica');
    if ((''+Min).length == 1){
        Min = '0'+ Min;
    }
    text(H + ":" + Min, width/2.5, secondH);
    textSize(12);
    text(weekdayNames[week]+","+monthNames[M-1]+ " " + D, 224, secondH+(145-126));
 
    pop();
 
    //display of mins and hours on top left
    push();
    fill(255);
    text(H+ ":" + Min, 201, 62);
    pop();
 
    //home bar
    fill(255);
    rect(240, height-170, 100, 5, 10);
 
 
 
  //image(frames[1], 80, -20, 500, 700);
  //image(frames[0], 80, -20, 500, 700);
 
  push();
  noFill();
  stroke(0);
  strokeWeight(9);
  rect(185, 42, 218, 405, 25);
  beginShape();
  strokeWeight(1);
  fill(0);
  curveVertex(245, 47);
  curveVertex(245, 47);
  curveVertex(261, 61);
  curveVertex(329, 61);
  curveVertex(345, 47);
  curveVertex(345, 47);
  endShape();
  pop();
  image(skin[counter],100, -18, 430, 655);
  image(finger[counter], 200, 220, 300, 300);
  //image(finger[counter], 200, 220, 300, 300);
 
  push();
  noFill();
  beginShape();
  stroke(0);
  strokeWeight(4);
  print(mouseX, mouseY);
  curveVertex(180, 245);
  curveVertex(180, 245);
  curveVertex(159, 256);
  curveVertex(152, 270);
  curveVertex(161, 278);
  curveVertex(180, 281);
  curveVertex(180, 281);
  endShape();
 
  beginShape();
  curveVertex(180, 299);
  curveVertex(180, 299);
  curveVertex(161, 308);
  curveVertex(155, 322);
  curveVertex(160, 333);
  curveVertex(180, 334);
  curveVertex(180, 334);
  endShape();
 
  beginShape();
  curveVertex(180, 365);
  curveVertex(180, 365);
  curveVertex(165, 374);
  curveVertex(157, 388);
  curveVertex(164, 398);
  curveVertex(180, 398);
  curveVertex(180, 398);
  endShape();
 
  beginShape();
  curveVertex(408, 243);
  curveVertex(408, 243);
  curveVertex(426, 272);
  curveVertex(435, 292);
  curveVertex(446, 309);
  curveVertex(457, 322);
  curveVertex(457, 322);
  curveVertex(474, 348);
  curveVertex(480, 365);
  curveVertex(482, 390);
  curveVertex(482, 410);
  curveVertex(482, 432);
  curveVertex(482, 450);
  curveVertex(488, 495);
  curveVertex(496, 526);
  curveVertex(501, 545);
  curveVertex(505, 562);
  curveVertex(518, 605);
  curveVertex(518, 605);
  endShape();
  if (counter == 1 ){
    handM1 = 383;
    handM2 = 360;
 
  }else{
    handM1 = 383;
    handM2 = 369;
 
  }
 
  beginShape();
  curveVertex(433, 326);
  curveVertex(433, 326);
  curveVertex(407, 340);
  curveVertex(393, 354);
  curveVertex(handM1, handM2);
  curveVertex(handM1, handM2);
  endShape();
 
  beginShape();
  curveVertex(285, 453);
  curveVertex(285, 453);
  curveVertex(286, 461);
  curveVertex(296, 469);
  curveVertex(308, 466);
  curveVertex(300, 476);
  curveVertex(301, 490);
  curveVertex(309, 501);
  curveVertex(318, 504);
  curveVertex(307, 508);
  curveVertex(309, 528);
  curveVertex(315, 544);
  curveVertex(332, 570);
  curveVertex(359, 600);
  curveVertex(359, 600);
  //curveVertex(handM1, handM2);
  endShape();
 
 
 
  pop();
 
  //battery drawing
  if (Min < 40){
    batteryR = 255;
    batteryG = 255;
    batteryB = 255;
  } else {
    batteryR = 213;
    batteryG = 73;
    batteryB = 61;
  }
 
  batteryL = (60-Min)/60 * 25;
  push();
  fill(batteryR,batteryG,batteryB);
  rect(width/2+57, 53, batteryL, 12, 4);
  noFill();
  stroke(255);
  rect(width/2+57, 53, 25, 12, 3);
 
 
 
  pop();
 
 
  push();
  fill(144, 173, 189);
  rect(width/4,0, 400, 37);
  pop();
 
 
 
}

The GIF:

airsun-LookingOutwards02

Reverb is a project by Madeline Gannon. It is a context-aware 3D modeling environment that lets users design ready-to-print wearables around their own body. What I admire most about this project is the dynamic it creates, both in the transformation from digital design to physical product and in the evolution from a living squid, producing random activity, into patterns that are aesthetically pleasing. Because the work is all generated from a virtual squid, there is a rich dynamic for the creator to play with: the squid can be moved around a person's neck, leaving its trace in various ways. The modeling interface uses a three-phase workflow, from 3D scanning to 3D modeling to 3D printing. It starts with the scanning phase, which imports the physical context into the virtual environment by translating physical space into a three-dimensional point cloud. The modeling phase then creates the expressive digital form through gestures made within the scanned context, and the printing phase translates this digital geometry into physical matter.

Looking at the collar studies, the different versions, with their elegant curves and intersections, all show Madeline Gannon's artistic sensibility. The project also involves effective complexity, combining elements of order, from the regularity and logic of the complex geometries built around the 3D-scanned context, with elements of disorder, from the user's interaction with and manipulation of the virtual squid.

 

airsun-Reading02

Question 1A. Be sure to have read the first 20 pages of "Generative Art Theory" by Philip Galanter (p.146-166). In your own words, and in just a few sentences, discuss an example of something you like which exhibits effective complexity. (It doesn't have to be something made by people.) Where does your selection sit between total order (e.g. crystal lattice) and total randomness (e.g. gas molecules, white noise, static)? Include an image which illustrates your selected thing.

My understanding of a project being effectively complex is: 1. it combines both ordered and disordered systems; 2. the whole concept does not work in only one, linear direction, but in directions that are random and cannot be easily predicted. Therefore, my first thought was to find works that incorporate living beings (because they are less likely to be controlled), for example a coded ecosystem where animals and human beings act as the unpredictable randomness. However, while searching for this kind of project, I encountered "inFORM". "inFORM" is a Dynamic Shape Display that can render 3D content physically, so users can interact with digital information in a tangible way. On one hand, the project behaves precisely and predictably when displaying three-dimensional math functions, e.g. plotting hyperbolic paraboloids out of a combination of cubes. On the other hand, it also incorporates psychological and physiological elements that increase its complexity. First, it reacts to physical human actions; for example, it can replicate a person's movements to interact with a red ball. Moreover, it nudges people toward certain actions; for instance, the cubes form a moving wave that keeps a participant's phone in motion when someone is calling. Because the work responds to these kinds of actions, it incorporates a large degree of randomness as our actions become random and unpredictable. In this sense, I find "inFORM" to be an example that exhibits effective complexity, and it sits closer to the "total order" end.

Question 1B. Quickly skim the remaining 10 pages of Galanter's article, in which he outlines nine different problems with generative art (The Problem of Authorship; The Problem of Intent; The Problem of Uniqueness; The Problem of Authenticity; The Problem of Dynamics; The Problem of Postmodernity; The Problem of Locality, Code, and Malleability; The Problem of Creativity; The Problem of Meaning). Select one of these problems for which you yourself feel some sort of internal conflict or personal stake. Discuss your internal conflict. Which side of the argument do you come down on?

The Relationship between Works of Art and the Audience - in Response to the Change of Aura
Having read Walter Benjamin's "The Work of Art in the Age of Mechanical Reproduction", I remember working out how the development of technology enabled the reproduction of artworks through media such as film and photography. These inventions altered the aura of the artwork and its relationship with the audience. With mass production, not only can the reproduced artwork never fully present the original work with its unique aura, the original work also loses a sense of authenticity and authority by being depreciated. However, is it the same for generative art? In generative art, what is mass-produced are unique objects rather than copies. Therefore, the idea of aura, and the relationship between the audience and the artwork, remains ambiguous to me. What generative art brings us is very different from "traditional" painting and sculpture. Not only has the context changed (our understanding of and familiarity with technology, and our acceptance of what counts as art), the way we interact with the artwork has also changed. For example, we cannot touch a painting in the Louvre (or sometimes even take a photo of it); in generative art, however, many works welcome the audience to interact, whether by pressing a key on a keyboard to generate a piece or by performing real physical actions.

airsun-AnimatedLoop

 

Overall, I am happy that this work turned out similar to what I had imagined when doing the sketches. The choice of color and the overall style go well with the theme. The idea here is an elevator going up and down as the eyeball of an eye. However, I wish I had spent more time on this project to increase the complexity of the GIF; currently, I feel something is missing in the background. As an improvement, I could add more features that identify the object as an elevator more clearly, for example buttons that control the elevator going "up" and "down."
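As a side note, the core of the looping motion can be sketched very compactly: the "elevator" position is driven by the loop's completion percentage, so the last frame flows seamlessly back into the first. The hypothetical sketch below uses a simple cosine ease in place of the DoublePolynomialSigmoid easing and the image frames used in the real program further down.

// Minimal, hypothetical sketch: a seamless up-and-down loop driven by percent-of-loop.
var nFramesInLoop = 120;

function setup() {
  createCanvas(640, 640);
}

function draw() {
  background(216, 195, 131);
  var percent = (frameCount % nFramesInLoop) / nFramesInLoop;
  // Cosine ease: 0 -> 1 -> 0 over one loop, so the motion returns to its start.
  var eased = 0.5 - 0.5 * cos(percent * TWO_PI);
  var y = map(eased, 0, 1, height / 3, 2 * height / 3);
  noStroke();
  fill(165, 73, 59);
  ellipse(width / 2, y, 200, 200); // the "elevator" eyeball
}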

Brainstorm and sketch before starting the project (the majority of the sketch was done in Photoshop)

//Name: AirSun
//Sijings
//60-212
//code modified from Code Template for Looping GIFS
 
//===================================================
var myNickname = "airsun";
var nFramesInLoop = 120;
var bEnableExport = true;
var nElapsedFrames;
var bRecording;
var theCanvas;
var frames = [];
 
//===================================================
function setup() {
  theCanvas = createCanvas(640, 640);
  bRecording = false;
  nElapsedFrames = 0;
}
 
//===================================================
function keyTyped() {
  if (bEnableExport) {
    if ((key === 'f') || (key === 'F')) {
      bRecording = true;
      nElapsedFrames = 0;
    }
  }
}
 
//===================================================
function preload(){
  var filenames = [];
  filenames[0] = "https://i.imgur.com/4SUU0dP.png";
  filenames[1] = "https://i.imgur.com/ZkRe8eA.png";
  filenames[2] = "https://i.imgur.com/htCZu9X.png";
  filenames[3] = "https://i.imgur.com/oymFCPW.png";
  filenames[4] = "https://i.imgur.com/ahNh0P1.png";
  for (i=0; i<filenames.length; i++){
    frames.push(loadImage(filenames[i]));
  }
}
 
//===================================================
function draw() {
  var percentCompleteFraction = 0;
  if (bRecording) {
    percentCompleteFraction = float(nElapsedFrames) / float(nFramesInLoop);
  } else {
    percentCompleteFraction = float(frameCount % nFramesInLoop) / float(nFramesInLoop);
  }
 
  renderMyDesign (percentCompleteFraction);
 
  if (bRecording && bEnableExport) {
    var frameOutputFilename = myNickname + "_frame_" + nf(nElapsedFrames, 4) + ".png";
    print("Saving output image: " + frameOutputFilename);
    saveCanvas(theCanvas, frameOutputFilename, 'png');
    nElapsedFrames++;
    if (nElapsedFrames >= nFramesInLoop) {
      bRecording = false;
    }
  }
}
 
 
//===================================================
function renderMyDesign (percent) {
  background(216,195,131);
  smooth();
  stroke(0, 0, 0);
  strokeWeight(2);
 
  //----------------------
  // Here, I assign some handy variables. 
  var cx = 100;
  var cy = 100;
 
  //----------------------
  // Here, I use trigonometry to render a rotating element.
  var radius = 80;
  var rotatingArmAngle = percent * TWO_PI;
  var px = cx + radius * cos(rotatingArmAngle);
  var py = cy + radius * sin(rotatingArmAngle);
 
 
 
  //----------------------
  // Here's a linearly-moving square
  var squareSize = 20;
  var topY = height/3 - squareSize - 2;
  var botY = height-height/3 + 2;
  var eyeL = width/10;
  var eyeR = width-width/10;
  var eyeW = 200;
 
  strokeWeight(2);
  fill(248,242,231);
  beginShape();
  noStroke();
  //upper part of the eye
  curveVertex(eyeR, topY+(botY-topY)/2);
  curveVertex(eyeR, topY+(botY-topY)/2);
  curveVertex(eyeL+(eyeR-eyeL)*2/3, topY);
  curveVertex(eyeL+(eyeR-eyeL)/3, topY);
  curveVertex(eyeL, topY+(botY-topY)/2);
  curveVertex(eyeL, topY+(botY-topY)/2);
  //lower part of the eye
  endShape();
 
  beginShape();
  curveVertex(eyeR, topY+(botY-topY)/2);
  curveVertex(eyeR, topY+(botY-topY)/2);
  curveVertex(eyeL+(eyeR-eyeL)*2/3, botY);
  curveVertex(eyeL+(eyeR-eyeL)/3, botY);
  curveVertex(eyeL, topY+(botY-topY)/2);
  curveVertex(eyeL, topY+(botY-topY)/2);
  endShape();
 
 
 
  //other facial features
  noFill();
  stroke(90,84,68);
  strokeWeight(4);
  arc(eyeL+5, topY+(botY-topY)/2, 50, 50, 7/4*PI, 1/4*PI);
 
  beginShape();
  curveVertex(100, height);
  curveVertex(170, height);
  curveVertex(60, height/2+200);
 
  curveVertex(21, height/2);
  curveVertex(65, 91);
  curveVertex(65, 91);
  endShape();
  fill(90,84,68);
  ellipse(70, height, 80, 40);
 
 
 
  //eyeballs' drawing
  var eased = DoublePolynomialSigmoid (percent, 1); 
  if (percent < 0.5) {
  //eased = (eased + 0.5)%2; // shifted by a half-loop, for fun
    var yPosition2 = map(eased, 0, 1, topY-150, botY-100);
  //print(yPosition2, botY-200)
  }else{
    var yPosition2 = map(eased, 0, 1, botY-100, topY-150);
  }
 
  fill (165, 73, 59); 
  //ellipse (eyeL+(eyeR-eyeL)/2, yPosition2, eyeW, eyeW); 
  var currentY=eyeL+(eyeR-eyeL)/5-70;
  var currenyX=yPosition2-150;
  var numofF=0;
  if (frameCount % 5 == 0){
    numofF +=1;
  }
  var framesC = numofF % 5;
 
  image(frames[framesC], currentY, currenyX, 430, 700);
 
 
 
  push();
  fill(216,195,131);
  noStroke();
  rect(width/3, height-180, 400, 200);
  pop();
 
  noFill();
  stroke(216,195,131);
  strokeWeight(200);
  beginShape();
  //upper part of the eye
  curveVertex(eyeR, topY+(botY-topY)/2-125);
  curveVertex(eyeR, topY+(botY-topY)/2-140);
  curveVertex(eyeL+(eyeR-eyeL)*2/3, topY-105);
  curveVertex(eyeL+(eyeR-eyeL)/3, topY-105);
  curveVertex(eyeL, topY+(botY-topY)/2-140);
  curveVertex(eyeL, topY+(botY-topY)/2-125);
  //lower part of the eye
  endShape();
  strokeWeight(70);
  beginShape();
  curveVertex(eyeR, topY+(botY-topY)/2+50);
  curveVertex(eyeR, topY+(botY-topY)/2+50);
  curveVertex(eyeL+(eyeR-eyeL)*2/3, botY+40);
  curveVertex(eyeL+(eyeR-eyeL)/3, botY+40);
  curveVertex(eyeL, topY+(botY-topY)/2+50);
  curveVertex(eyeL, topY+(botY-topY)/2+50);
  endShape();
 
  //eyeLash
  var eyeLashAngel = 45;
  var centerx = width / 2;
  var centery = botY-90;
  var radius = 240;
  stroke(90,84,68);
  strokeWeight(4);
  for (i = 0; i < 10; i++){
    var x = cos(radians(eyeLashAngel)) * radius;
    var y = sin(radians(eyeLashAngel)) * radius;
    var increasement;
    if (i < 5) {
      increasement = i * 15;
    }else{
      increasement = (10 - (i+1)) * 15;}
    line(centerx + x/1.2 , centery - y/1.2, centerx + 1.2*x , centery - y - increasement);
    eyeLashAngel = eyeLashAngel+10;
  }
  centery = topY + 90;
  eyeLashAngel = 225;
  for (i = 0; i < 10; i++){
    var x = cos(radians(eyeLashAngel)) * radius;
    var y = sin(radians(eyeLashAngel)) * radius;
    var increasement;
    if (i < 5) {
      increasement = i * 15;
    }else{
      increasement = (10 - (i+1)) * 15;}
    line(centerx + x/1.2 , centery - y/1.2, centerx + 1.2*x, centery - y + increasement);
    eyeLashAngel = eyeLashAngel+10;
  }
}
 
//following code got from https://github.com/golanlevin/Pattern_Master
//------------------------------------------------------------------
function DoublePolynomialSigmoid (_x, _a){
  var n = _a;
  var _y = 0;
  if (n%2 == 0){ 
    // even polynomial
    if (_x<=0.5){
      _y = pow(2.0*_x, n)/2.0;
    } 
    else {
      _y = 1.0 - pow(2*(_x-1.0), n)/2.0;
    }
  } 
 
  else { 
    // odd polynomial
    if (_x<=0.5){
      _y = pow(2.0*_x, n)/2.0;
    } 
    else {
      _y = 1.0 + pow(2.0*(_x-1.0), n)/2.0;
    }
 
  }
 
  return (_y);
}

airsun-Scope

For this project, I was new to Java, so I first spent a while learning the syntax. After finding that it shares many similarities with JavaScript, I started experimenting with different ideas to see how they would turn out visually. The idea of the scope is to show four fingers of a hand, facing downwards, scratching a surface. The index finger is the moving, looping object, and it always leaves a scratch mark.

Preparatory hand-drawn sketches of my design:

The PNG file:

The loop gif:

// Template for KidzLabs/4M/Toysmith Animation Praxinoscope
// https://www.amazon.com/4M-3474-Animation-Praxinoscope/dp/B000P02HYC
// https://www.walmart.com/ip/Animation-Praxinoscope-Science-Kits-by-Toysmith-3474/45681503
// Developed for Processing 3.3.6 * http://processing.org
// 23 January 2018 * Golan Levin 
 
// See information about Processing PDF export at: 
// https://processing.org/reference/libraries/pdf/index.html
// PDF generated by Processing can be opened in Adobe Illustrator.
import processing.pdf.*;
boolean bRecordingPDF = false;
 
float inch = 72; 
float diamArtInner = inch * 1.50; 
float diamArtOuter = inch * 4.80; 
float diamCutInner = inch * 1.41; 
float diamCutOuter = inch * 4.875; 
float holeDy = inch * 0.23;
float holeDx = inch * 0.20;
float holeD = inch * 0.1;
 
final int nFrames = 10; 
int myFrameCount = 0;
int exportFrameCount = 0; 
boolean bAnimate = true; 
boolean bExportFrameImages = false;
 
//-------------------------------------------------------
void setup() {
  size(792, 612); // 11x8.5" at 72DPI
  frameRate(15);
  smooth();
} 
 
//-------------------------------------------------------
void draw() {
  background(240); 
  if (bRecordingPDF) {
    beginRecord(PDF, "praxinoscope-output.pdf");
  }
 
  // Do all the drawing. 
  pushMatrix(); 
  translate(width/2, height/2);
  drawCutLines(); 
  drawGuides(); 
  drawAllFrames();
  popMatrix();
 
  if (bExportFrameImages) {
    // If activated, export .PNG frames 
    if (exportFrameCount < nFrames) {
      String filename = "frame_" + nf((exportFrameCount%nFrames), 3) + ".png";
      saveFrame("frames/" + filename);
      println("Saved: " + filename);
      exportFrameCount++;
      if (exportFrameCount >= nFrames) {
        bExportFrameImages = false;
        exportFrameCount = 0;
      }
    }
  }
 
  if (bRecordingPDF) {
    endRecord();
    bRecordingPDF = false;
  }
}
 
 
//-------------------------------------------------------
void keyPressed() {
  switch (key) {
  case ' ': 
    // Press spacebar to pause/unpause the animation. 
    bAnimate = !bAnimate;
    break;
 
  case 'p': 
  case 'P':
    // Press 'p' to export a PDF for the Praxinoscope.
    bRecordingPDF = true; 
    break;
 
  case 'f': 
  case 'F': 
    // Press 'f' to export .png Frames (to make an animated .GIF)
    myFrameCount = 0; 
    exportFrameCount = 0; 
    bExportFrameImages = true;
    bAnimate = true; 
    break;
  }
}
 
//-------------------------------------------------------
void drawCutLines() {
  fill(0); 
  textAlign(CENTER, BOTTOM); 
  text("Praxinoscope Template", 0, 0-diamCutOuter/2-6); 
 
  stroke(0); 
  strokeWeight(1.0);
 
  noFill(); 
  if (!bRecordingPDF) {
    fill(255); 
  }
  ellipse(0, 0, diamCutOuter, diamCutOuter);
 
  noFill(); 
  if (!bRecordingPDF) {
    fill(240); 
  }
  ellipse(0, 0, diamCutInner, diamCutInner);
 
  noFill(); 
  ellipse(diamCutOuter/2 - holeDx, 0-holeDy, holeD, holeD); 
 
  line (diamCutInner/2, 0, diamCutOuter/2, 0);
}
 
//-------------------------------------------------------
void drawGuides() {
  // This function draws the guidelines. 
  // Don't draw these when we're exporting the PDF. 
  if (!bRecordingPDF) {
 
    noFill(); 
    stroke(128); 
    strokeWeight(0.2); 
    ellipse(0, 0, diamArtInner, diamArtInner); 
    ellipse(0, 0, diamArtOuter, diamArtOuter);
 
    for (int i=0; i<nFrames; i++) {
      float angle = map(i, 0, nFrames, 0, TWO_PI); 
      float pxi = diamArtInner/2 * cos(angle);
      float pyi = diamArtInner/2 * sin(angle);
      float pxo = diamArtOuter/2 * cos(angle);
      float pyo = diamArtOuter/2 * sin(angle);
      stroke(128); 
      strokeWeight(0.2);
      line (pxi, pyi, pxo, pyo);
    }
 
    // Draw the red wedge outline, highlighting the main view.
    int redWedge = 7; // assuming nFrames = 10
    for (int i=redWedge; i<=(redWedge+1); i++) {
      float angle = map(i, 0, nFrames, 0, TWO_PI); 
      float pxi = diamArtInner/2 * cos(angle);
      float pyi = diamArtInner/2 * sin(angle);
      float pxo = diamArtOuter/2 * cos(angle);
      float pyo = diamArtOuter/2 * sin(angle);
      stroke(255, 0, 0); 
      strokeWeight(2.0);
      line (pxi, pyi, pxo, pyo);
    }
    noFill(); 
    stroke(255, 0, 0); 
    strokeWeight(2.0);
    float startAngle = redWedge*TWO_PI/nFrames;
    float endAngle = (redWedge+1)*TWO_PI/nFrames;
    arc(0, 0, diamArtInner, diamArtInner, startAngle, endAngle); 
    arc(0, 0, diamArtOuter, diamArtOuter, startAngle, endAngle); 
 
 
    for (int i=0; i<nFrames; i++) {
      float angle = map(i, 0, nFrames, 0, TWO_PI); 
 
      pushMatrix();
      rotate(angle); 
      float originY = ((diamArtOuter + diamArtInner)/2)/2;
      translate(0, 0-originY); 
 
      noFill(); 
      stroke(128); 
      strokeWeight(0.2);
      line (-inch/2, 0, inch/2, 0); 
      line (0, -inch/2, 0, inch/2); 
 
      popMatrix();
    }
  }
}
 
//-------------------------------------------------------
void drawAllFrames() {
  for (int i=0; i<nFrames; i++) {
    float angle = map(i, 0, nFrames, 0, TWO_PI); 
    float originY = ((diamArtOuter + diamArtInner)/2)/2;
 
    pushMatrix();
    rotate(angle); 
    translate(0, 0-originY); 
    scale(0.8, 0.8); // feel free to ditch this 
 
    int whichFrame = i; 
    if (bAnimate) {
      whichFrame = (i+myFrameCount)%nFrames;
    }
    drawArtFrame (whichFrame); 
    // drawArtFrameAlternate (whichFrame); 
 
    popMatrix();
  }
  myFrameCount++;
}
 
 
//-------------------------------------------------------
void drawArtFrame (int whichFrame) { 
  // Draw the artwork for a generic frame of the Praxinoscope, 
  // given the framenumber (whichFrame) out of nFrames.
  // NOTE #1: The "origin" for the frame is in the center of the wedge.
  // NOTE #2: Remember that everything will appear upside-down!
 
  // Draw the frame number
  fill(0); 
  noStroke(); 
  textAlign(CENTER, CENTER); 
 
  // Draw some expanding boxes, centered on the local origin
  int nBoxes = 32;
 
  float ry=0;
  float rx=14;
  float rs=11;
  float W1 = whichFrame%nBoxes;
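  // W1 grows with the frame number, so the moving (index) finger is drawn slightly lower in each successive frame.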
  float Yoffset = 15;
  stroke(0);
  strokeWeight(2.5);
  line(-45,ry, 45,ry);
 
  fill(0);
  strokeWeight(1);
  line(0-rx-2, ry, 0-rx-2, ry+40);
  ellipse(0-rx-2, ry+(W1), rs, rs*5);
  ellipse(0, ry, rs, rs*5);
  ellipse(rx, ry, rs, rs*5);
  ellipse(rx*2, ry, rs, rs*5);
  pushMatrix();
  rotate((7*PI)/4);
  ellipse(18, -3, 10, 35);
  ellipse(28, 8, 10, 35);
  popMatrix();
  pushMatrix();
  rotate((13*PI)/7);
  ellipse(36, 5, 10, 30);
  popMatrix();
 
  fill(255);
  noStroke();
  rect(0-rx-5, ry+(W1)+Yoffset,6.7,8,15,15,45,45);
  fill(200,200,200);
  stroke(255);
  strokeWeight(1);
  line(0-rx-2-rs/4, ry+(W1)+Yoffset-4, 0-rx-2+rs/4, ry+(W1)+Yoffset-4);
  line(0-rx-2-rs/4, ry+(W1)+Yoffset-6, 0-rx-2+rs/4, ry+(W1)+Yoffset-6);
 
  for (int i=0; i<3; i++){
    fill(255);
    rect(-3+14*i, ry+Yoffset,6.7,8,15,15,45,45);
    line(3+14*i, ry+Yoffset-4, 0-rs/4+14*i, ry+Yoffset-4);
    line(3+14*i, ry+Yoffset-6, 0-rs/4+14*i, ry+Yoffset-6);
  }
 
 
}
 
//-------------------------------------------------------
void drawArtFrameAlternate(int whichFrame) { 
  // An alternate drawing test. 
  // Draw a falling object. 
 
 
  // Draw a little splat on the frame when it hits the ground. 
  if (whichFrame == (nFrames-1)) {
    stroke(0, 0, 0); 
    strokeWeight(0.5); 
    int nL = 10;
    for (int i=0; i<nL; i++) {
      float a = HALF_PI + map(i, 0, nL-1, 0, TWO_PI);
      float cx = 12 * cos(a);
      float cy = 10 * sin(a); 
      float dx = 16 * cos(a);
      float dy = 13 * sin(a); 
      line (cx, 45+cy, dx, 45+dy);
    }
  }
 
  // Draw a little box frame
  fill(255); 
  stroke(0, 0, 0);
  strokeWeight(1); 
  rect(-5, -50, 10, 100); 
 
  // Make the puck accelerate downward
  float t = map(whichFrame, 0, nFrames-1, 0, 1); 
  float t2 = pow(t, 2.0); 
  float rh = 8 + whichFrame * 0.5; // wee stretch
  float ry = map(t2, 0, 1, 0, 100-rh) - 50; 
 
  noStroke(); 
  fill(0, 0, 0);
  rect(-5, ry, 10, rh);
}

airsun-Interruptions

This project was tricky at first because I was not really sure how to use the concept of noise as the primary function for accomplishing the assignment. It was interesting to come to understand the concept over the course of the project, and to see how basic math concepts such as cosine and sine were also part of the knowledge needed for the assignment. Moreover, the finished product can look very organic or very rational depending on individual choices: the lines can bend and connect with each other like hairs, or stay strictly close to the original piece.

//Clair(sijing) Sun
//sijings@andrew.cmu.edu


// Starter Code for "Embedded Iteration + Randomness"


var boolDoRefresh;
var n=0;
var randomD;
var gridSize = 9;
var degreeR;
var lineLength=17;
var xoff=0.5;
var poff=0.01;

function setup() {
    createCanvas(480, 480);
    boolDoRefresh = false;
    
}


function draw(){
    background(250);
    for (var x = gridSize; x <= width - gridSize; x += gridSize) {
            for (var y = gridSize; y <= height - gridSize; y += gridSize) {
                angleMode(DEGREES);
                // The noise value at this grid point chooses the line's direction (0-180 degrees).
                degreeR = noise(x*xoff, y*xoff);
                degreeR = degreeR*180;
                // A second, lower-frequency noise field decides which lines are omitted (the "interruptions").
                randomD = noise(x*poff, y*poff);
                if (randomD < 0.65){
                    line(x, y, x+cos(degreeR)*lineLength, y+sin(degreeR)*lineLength);
                }

        }
    }
}


 
function mousePressed() {
    noiseSeed();   // re-seed the noise so that each click generates a new pattern
    redraw();
}
 


airsun-Intersections

 

//Clair(sijing) Sun
//sijings@andrew.cmu.edu
//Assignment-5

var lineP=[];
var boolDoRefresh;
var numberofL=10;

function setup() {
    createCanvas(480, 480);
    boolDoRefresh = false;
    for (var i=0; i<numberofL; i++){
        var x1 = random(0,width);
        var x2 = random(0,width);
        var y1 = random(0,height);
        var y2 = random(0,height);
        lineP[i] = [x1,y1,x2,y2];
    }
}

function draw() {
    background(200);

    //regenerate if mousePressed
    if (boolDoRefresh) { 
        for (var i=0; i<numberofL; i++){
            var x1 = random(0,width);
            var x2 = random(0,width);
            var y1 = random(0,height);
            var y2 = random(0,height);
            lineP[i] = [x1,y1,x2,y2];
        }
    boolDoRefresh=false
    }

    //for drawing the lines
    for (var i = 0; i < numberofL; i += 1){
        line(lineP[i][0],lineP[i][1],lineP[i][2],lineP[i][3]);
    }

    //for drawing the intersections, spliting to two lines each time
    for (var j = 0; j < numberofL; j += 1){
            for (var i = 0; i < numberofL; i += 1){
                var x1 = lineP[j][0];
                var y1 = lineP[j][1];
                var x2 = lineP[j][2];
                var y2 = lineP[j][3];
                var x3 = lineP[i][0];
                var y3 = lineP[i][1];
                var x4 = lineP[i][2];
                var y4 = lineP[i][3];
                intersect(x1, y1, x2, y2, x3, y3, x4, y4);
            }
    }
    
}

// Modified from line intercept math by Paul Bourke http://paulbourke.net/geometry/pointlineplane/
// Determine the intersection point of two line segments
// Modified from http://paulbourke.net/geometry/pointlineplane/javascript.txt
function intersect(x1, y1, x2, y2, x3, y3, x4, y4) {
    if ((x1 == x2 && y1 == y2) || (x3 == x4 && y3 == y4)) {
        return false
    }

    var denom = ((y4 - y3) * (x2 - x1) - (x4 - x3) * (y2 - y1))
 
    if (denom === 0) {
        return false
    }
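    // ua and ub below are the fractional positions of the intersection point along
    // the first and second segments respectively; both must lie in [0, 1] for the
    // segments themselves (not just the infinite lines) to intersect.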
 
    var ua = ((x4 - x3) * (y1 - y3) - (y4 - y3) * (x1 - x3)) / denom
    var ub = ((x2 - x1) * (y1 - y3) - (y2 - y1) * (x1 - x3)) / denom
 
    // is the intersection along the segments
    if (ua < 0 || ua > 1 || ub < 0 || ub > 1) {
        return false
    }

    fill(0);
    ellipse (x1 + ua * (x2 - x1),y1 + ua * (y2 - y1),15);
}


 
function mousePressed() {
    boolDoRefresh = true;
}