We directly experience an environment via the Umwelt created from our perception. We can infer further phenomena outside our perception through deductive reasoning (I can't see UV radiation, but I can recognise its presence through a sunburn), or we can expand our perception through technological means.
Investigate an invisible phenomenon in the Stadionbrache by recording sensor data in its spaces. Briefly analyse your findings and visualise the results for the class.
Hook up a sensor of your choice to the MKR1310 and record the data using the examples on GitHub. Feel free to use the environmental combo, which is already set up in the example.
Find an appropriate library and run the example for the sensor of your choosing. Modify the example to print the sensor values over serial at 9600 baud as a single JSON object of comma-separated key-value pairs, in the following format. The GitHub repo already has an example set up for the environmental combo.
Serial.print("{\"sensorNameOne\":");
Serial.print(sensorData1);
Serial.print(",");
Serial.print("\"sensorNameTwo\":");
Serial.print(sensorData2);
Serial.print("}");
Serial.println();
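Each line the board emits should parse as valid JSON. A quick way to sanity-check the format on the receiving side (a Python sketch; the sensor names and values here are placeholders, not the ones your sensor will produce):

```python
import json

# One line as emitted over serial by the Arduino sketch above
# (hypothetical sensor names and values).
line = '{"temperature":23.5,"humidity":41.2}'

record = json.loads(line)  # raises ValueError if the line is malformed
print(record["temperature"])  # 23.5
```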
On one computer, install and run the JsonReciever Processing sketch to record the sensor data to a JSON file.
Take some notes and photos of where you recorded the data. Be careful to name and organise your JSON files!
Analyse the sensor data and choose an appropriate visualisation. This can be done as an analogue sketch or with digital tools. Use the JsonAverageValues Processing sketch to get a quick average of the values from one place.
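The averaging step amounts to a per-key mean over the recorded JSON objects. A minimal Python equivalent of that computation (field names and values are placeholders, and this is an illustration of the idea, not the Processing sketch itself):

```python
def average_values(records):
    """Compute the per-key mean across a list of JSON objects with numeric values."""
    totals, counts = {}, {}
    for record in records:
        for key, value in record.items():
            totals[key] = totals.get(key, 0.0) + value
            counts[key] = counts.get(key, 0) + 1
    return {key: totals[key] / counts[key] for key in totals}

# Hypothetical readings from one location:
readings = [
    {"temperature": 22.0, "humidity": 40.0},
    {"temperature": 24.0, "humidity": 44.0},
]
print(average_values(readings))  # {'temperature': 23.0, 'humidity': 42.0}
```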
What can you conclude about the spaces that were not immediately visible?
On what time scale do the phenomena vary and to what extent do humans interact with these phenomena?
What is an appropriate means of making this phenomenon perceptible?
What leverage points exist to modify this phenomenon?