Realisations of smart environments such as Smart Cities and Smart Buildings promise intelligently managed spaces that maximise benefits to users (e.g. shorter citizen commute times, greater building occupant comfort) while minimising the resources required (e.g. transportation costs, energy costs, pollution). These smart environments were foreseen by Weiser in his work on ubiquitous computing at Xerox PARC in the 1980s: physical worlds interwoven with sensors, actuators, displays, and computational elements, embedded seamlessly into everyday objects and connected through a continuous network. Only now, 30 years later, are elements of this vision being realised through wireless, internet-enabled mobile devices.
Enabled by this constant connectivity, the concept of human-in-the-loop sensing has emerged. Professor Amit Sheth has described the phenomenon as citizen sensing: “…humans as citizens on the ubiquitous Web, acting as sensors and sharing their observations and views using mobile devices and Web 2.0 services”. In parallel with the emergence of citizen sensing is research into, and the realisation of, wireless sensor networks and the Internet of Things (IoT). The IoT vision relies on smart connected devices, including sensors, that are constantly connected and relaying data for real-time analysis.
The combination of research into human-in-the-loop sensing (citizen sensing) and the IoT led us to develop the concept of citizen actuation. While citizen sensors generally only report on events in their surroundings, we try to create actionable knowledge that allows people to make informed decisions through direct or indirect feedback. We define citizen actuation as the activation of a human being as the mechanism by which a control system acts upon the environment.
The figure shows a feedback loop divided into four distinct stages. The first, evidence, collects the data and processes it for presentation to the user. The second stage relays the information to the user with richer context; this can be through visual representations such as graphs, signs, or even warnings. A good example is a speed sign that measures a car’s current speed and displays it to the driver alongside the speed limit. The third stage is consequence, which shows the user the gain from what they have reported. The final stage is action, where the user completes an action or makes a choice; this action or choice has a reaction, and the feedback loop can restart. The figure also shows where we see citizen sensing and citizen actuation each taking place within the loop.
We envisage citizen actuators as engaged citizens within any community or organisation, or any people connected through some shared place or object. In our research we developed and ran an experiment using citizen actuation as a component of a Cyber-Physical Social System (CPSS). This CPSS combined IoT technologies, Linked Data, and social media (Twitter in our experiment) to measure and track energy usage in the Digital Enterprise Research Institute (DERI) building in NUI Galway, and alerted building occupants to high energy usage outside of office hours with a tweet of the form “Hi @username could you check if lights/air-con were left on in the 1st Floor North Wing please? And turn them off if possible. Thanks”. The user, if available, can then choose to check and turn off any equipment in the area. During our experiment, energy usage dropped by approximately 24% compared to a baseline set over a four-month period.
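The alerting logic above can be sketched in a few lines of Java. This is a minimal illustration, not the system's actual implementation: the class name `EnergyAlert`, the usage threshold, and the office-hours window are all hypothetical values chosen for the sketch; only the tweet template comes from the experiment described above.

```java
import java.time.LocalTime;

public class EnergyAlert {
    // Hypothetical threshold and office hours; the real system's values are not given above.
    static final double HIGH_USAGE_WATTS = 2000.0;
    static final LocalTime OFFICE_START = LocalTime.of(9, 0);
    static final LocalTime OFFICE_END = LocalTime.of(18, 0);

    static boolean outsideOfficeHours(LocalTime t) {
        return t.isBefore(OFFICE_START) || t.isAfter(OFFICE_END);
    }

    // Returns the alert tweet when usage is high outside office hours, otherwise null.
    static String maybeAlert(double watts, LocalTime now, String user, String zone) {
        if (watts > HIGH_USAGE_WATTS && outsideOfficeHours(now)) {
            return "Hi @" + user + " could you check if lights/air-con were left on in the "
                    + zone + " please? And turn them off if possible. Thanks";
        }
        return null;
    }
}
```

High usage at midday produces no tweet; the same reading late in the evening triggers the citizen-actuation request.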
List of sensors available on the Galaxy Nexus (code from this post used)
But whenever I tried to run a simple program to display the sensor output, the activity crashed due to a NullPointerException. I read a few posts and none was very clear… some mentioned that a gyroscope was needed.
Found this post, which implies the device needs a gyroscope to provide a Rotation Vector sensor.
Not 100% certain, but that’s as much information as I could find online!
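A defensive check avoids the crash described above: `getDefaultSensor()` returns null when a sensor is absent, so test for null before registering a listener. This is a sketch assuming it runs inside an Activity’s `onCreate()`; it isn’t runnable outside an Android device or emulator.

```java
// Assumes we are inside an Activity's onCreate(); needs
// android.hardware.Sensor, android.hardware.SensorManager, android.util.Log.
SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
Sensor rotation = sm.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);

if (rotation == null) {
    // No rotation vector sensor – likely because the device lacks a gyroscope,
    // since this virtual sensor is fused from gyroscope, accelerometer, and magnetometer.
    Log.w("Sensors", "Rotation vector sensor unavailable on this device");
} else {
    // Safe to register a SensorEventListener for the rotation vector here.
}
```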
Talk on Android Sensor Fusion – combining hardware sensors to derive virtual sensor data
The code below displays all available sensors (hardware and virtual)
main.xml (just a TextView to show the list of sensors)
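Since the listing itself isn’t reproduced here, this is a minimal sketch of an Activity that does what the post describes: `SensorManager.getSensorList(Sensor.TYPE_ALL)` returns both hardware and virtual sensors. The layout name `main` and the TextView id `sensor_list` are assumptions to match the main.xml mentioned above; it only runs on an Android device or emulator.

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.widget.TextView;

import java.util.List;

public class SensorListActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main); // main.xml with a single TextView (id assumed: sensor_list)

        SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);
        // TYPE_ALL returns every sensor, hardware and virtual (fused).
        List<Sensor> sensors = sm.getSensorList(Sensor.TYPE_ALL);

        StringBuilder sb = new StringBuilder();
        for (Sensor s : sensors) {
            sb.append(s.getName()).append(" (").append(s.getVendor()).append(")\n");
        }
        ((TextView) findViewById(R.id.sensor_list)).setText(sb.toString());
    }
}
```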