Automatic Supervision of Temperature, Humidity, and Luminance with an Assistant Personal Robot

Smart environments and Ambient Intelligence (AmI) technologies are defining a future society in which energy optimization and intelligent management are essential for sustainable progress. Mobile robotics is also making an important contribution to this progress through the integration of sensors and intelligent processing algorithms. This paper presents the application of an Assistant Personal Robot (APR) as an autonomous agent for temperature, humidity, and luminance supervision in human-frequented areas. The multiagent capabilities of the robot allow it to gather sensor information while exploring or performing specific tasks and then to verify human comfort levels. The proposed methodology creates information maps with the distribution of temperature, humidity, and luminance, interprets this information in terms of comfort, and warns about corrective actions if required.


Introduction
Human health and comfort are usually related to the ambient conditions where an individual spends long periods of time during the day. Advances in Ambient Intelligence (AmI) have fostered the implementation of smart environments that are able to detect and react to human presence. The concept of smart environments is usually focused on energy optimization, especially in high-demand buildings and facilities. One of the objectives of AmI is to maintain an efficient use of energy resources by dynamically adjusting the behavior of the different actuators that control the environmental conditions through the detection of and adaptation to human presence. The architecture of such systems is usually based on sensor networks converging into a specialized computational unit that performs a real-time analysis of the gathered data, adjusting the behavior of the different actuators located inside the building to match the desired values for each specific situation. The paper [1] presented a case study of energy management in an intelligent building based on the integration of Wireless Sensor Networks (WSN). In this line, efficient management of the lighting control systems of buildings is essential for energy optimization and has already been addressed in the literature [2]. In addition, it has been shown that the use of WSN for low-cost energy optimization of green building lighting systems is a viable option [3]. The maintenance of air quality and thermal conditions in buildings also accounts for an important part of the energy consumed, due to the high power demand of heating, ventilation, and air conditioning (HVAC) units. The paper [4] proposed the implementation of a neurofuzzy controller for thermal comfort regulation in an office building, in which a predictor model is used to keep comfort constant despite the thermal adaptation time required by HVAC systems. Additionally, the implementation of sensor network technologies has also been considered in urban scenarios, for example, to monitor the air quality of different urban districts [5] and to monitor traffic [6].
The complexity and size of fixed sensor networks are on the rise [7]; this has encouraged research into new communication methodologies in an effort to create a common conceptual space among heterogeneous devices such as sensors, actuators, processing units, storage units, and terminals. The Internet of Things (IoT) concept was born from this need and has allowed an easy integration of various devices working together within a common intelligent environment [8]. Since the popularization of AmI technologies, such devices have often been used in household domains to enable smart management of comfort, healthcare, security, and energy saving. The resulting implementation of this methodology is popularly known as the smart home [9]. The global market currently offers simple smart home solutions for nonexpert users in an effort to reach the general public, encouraging the development of new easy-to-install proposals [10]. Most of these implementations are focused on providing healthcare solutions for elderly people or for people with medical conditions [11]. These approaches to healthcare smart environments are specifically designed to provide a range of features aimed at increasing the self-sufficiency and quality of life of their users.
The integration of mobile robots in smart environments is a challenging topic whose main objective is to extend the concept of the smart environment by integrating physical mobile platforms capable of interacting with users, environments, and other devices. The ENRICHME project [12] is an example that combines robotics and other intelligent systems in order to supervise and improve the quality of life of fragile elderly people. The paper [13] presented a smart home that monitors and analyzes the environmental conditions to generate and deliver sets of sequences used to control a service robot operating inside the home. The use of service robots for performing healthcare routines is also often proposed in the scientific literature. For example, in [14] a robot uses its computer vision system for fall detection. In the same line, other service robots have been proposed for integration in a smart environment, such as the Hobbit robot [15], which implements multiple assistive services and robust autonomous capabilities. In addition, robot-assisted services for shopping delivery and garbage collection are addressed and discussed in [16], in which several elderly people were invited to participate in the experiments. As the ambient conditions can vary across different locations inside the same room, one of the main drawbacks of fixed-position sensor networks is establishing the appropriate number of measuring devices, as well as their distribution inside a common space. Embedding sensors onboard mobile robots capable of navigating around a predefined operating area is a practical solution to the scalability and redundancy problems of static sensor networks. Since the navigation procedures of the robot require it to identify its position inside the explored area, this information can also be recorded along with the measured values, creating a link between the readings and their position inside the map. This methodology provides flexibility when acquiring information about the environment, which can be very useful in some applications. For example, in [17] a cloud-based service is proposed for environmental monitoring of data centers by means of explorations performed with autonomous robots. In [18] a mobile robot was proposed as an intelligent firefighting humanoid robot capable of distinguishing fire sources, smoke, and thermal reflections by processing images from a thermal infrared camera. Moreover, surveillance and intelligent control of outdoor real environments are addressed in [19] by using multiple sensor-equipped mobile robot agents robustly designed for land patrolling.
This paper proposes the development of an ambient supervision application as a complementary functionality of an Assistant Personal Robot (APR), a humanoid-shaped assistive robot [20]. Assistive robots are often conceived as polyvalent platforms capable of performing different kinds of assistive routines that can be dynamically programmed on demand using their embedded tools. This complementary application is focused on keeping the robot aware of its ambient conditions (temperature, humidity, and luminance) while performing conventional tasks. The main objective of the proposed methodology is to provide accurate maps with the ambient information obtained by the mobile robot while performing any task, in order to warn about uncomfortable conditions. In the future, the APR will use this information to adjust the ambient conditions by performing corrective actions.

Assistant Personal Robot
The platform used in this paper is the second prototype of the APR robot (named APR-02). This mobile robot is an improvement of the initial design, which focused on providing telepresence services controlled by a human operator [20]. The APR-02 (Figure 1) is designed for the development of fully autonomous tasks and is equipped with an improved computational unit, an improved mechanical structure, and an improved motor control board, and it is compatible with a wider range of sensors. The motion of the mobile robot is defined by a holonomic system composed of three omnidirectional wheels, allowing it to perform complex maneuvers in narrow indoor areas [21]. The mobile robot has a Hokuyo UTM-30LX laser range sensor (LIDAR) with a native resolution of 1 mm, which is used for robot localization, mapping, and obstacle avoidance. The main computational unit is a complete computer (Intel Core i7-6700K, 16 GB DDR4, SSD hard disk) set up as the main controller for all devices. The robot control system is based on a multiagent architecture [22] that enables the robot to execute the multiple processes usually required to perform the designed assistive tasks in an effective way. This multiprocess implementation scheme also provides versatility and makes it possible to manage multiple robot procedures and services at once.
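The idea of running several robot processes at once can be illustrated with a minimal sketch (all names here are hypothetical; the APR's actual control software is not described at code level in this paper): a background agent thread samples a sensor at a fixed period and queues the results while the main program continues with other tasks.

```python
import queue
import threading
import time

def ambient_agent(sample_fn, out_q, stop, period=1.0):
    """Background agent: polls a sensor at a fixed period and queues
    (timestamp, value) tuples until asked to stop."""
    while not stop.is_set():
        out_q.put((time.time(), sample_fn()))
        stop.wait(period)  # sleeps, but wakes immediately on stop

# Usage sketch with a stub sensor that always returns 22.5 degC
samples = queue.Queue()
stop = threading.Event()
agent = threading.Thread(target=ambient_agent,
                         args=(lambda: 22.5, samples, stop, 0.01))
agent.start()
time.sleep(0.05)  # the robot's main assistive task would run here
stop.set()
agent.join()
```

A real deployment would replace the stub lambda with a driver call for the embedded sensor board and would typically run one such agent per service.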

Methodology
The applied methodology for the proposed AmI application is based on the creation of dynamic maps that provide a visual representation of the ambient information measured along the explored area by the sensors onboard the mobile robot. The APR-02 executes an autonomous navigation procedure that identifies its location inside the map representing the area of operation; additionally, this procedure computes the trajectory that the robot must follow to reach its target destination. During the explorations, the robot records the readings of the selected sensors as well as the location at which each reading was obtained, allowing the creation of distribution maps for each monitored parameter.
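A localized sensor record of this kind can be sketched as a simple structure (field names and units are illustrative assumptions, not the APR's actual data format):

```python
from dataclasses import dataclass

@dataclass
class AmbientSample:
    """One localized ambient reading, tagged with time and map position."""
    timestamp: float      # seconds since epoch
    x_mm: float           # robot X coordinate in the virtual map (1 mm grid)
    y_mm: float           # robot Y coordinate in the virtual map
    temperature_c: float  # degrees Celsius
    humidity_pct: float   # relative humidity, percent
    luminance_lux: float  # illuminance, lux

# One sample as it might be gathered during an exploration
s = AmbientSample(1e9, 1250.0, 3400.0, 22.4, 48.0, 410.0)
```

A list of such records, one per second of exploration, is all that the map-building step below needs.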

Navigation.
One of the most important features of the APR-02 is its capability to perform complex autonomous tasks. Most of these tasks require information about the physical layout around the robot, as well as constant access to its current location. The first step toward an autonomous navigation system is to build a virtual map of the area of operation; once the map is built, the robot automatically stores it as a reference, allowing it to be loaded every time the robot needs to operate in that specific area. The methodology used for the mapping procedures is based on Simultaneous Localization and Mapping (SLAM), which processes laser range data provided by the 2D LIDAR device onboard the robot in order to compute the relative position of the robot while creating a two-dimensional model of the explored area with a resolution of 1 mm. The SLAM method used in this paper is based on the FastSLAM approach [23] with a customized Iterative Closest Point (ICP) algorithm for laser sample alignment [24].
The navigation process is another parallel process running on the control system. This process queries the current position of the robot obtained through the SLAM process and then computes the path to the current destination in the map. The path planning is performed by running an informed search algorithm (the A* algorithm [25] with a Manhattan heuristic) on a node-discretized version of the virtual map. Figure 2 shows an example of a virtual map created by the APR in a first exploration. The colored area in this map depicts the navigable nodes along with their associated weights. The weights of the nodes are assigned according to their safety; nodes located near obstacles are penalized with an additional cost. This method ensures that the path planning algorithm prioritizes safety over distance, keeping the robot as far as possible from walls and other obstacles. In addition, the navigation process is aware of unexpected obstacles detected by the laser range sensor in order to avoid possible collisions.
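The planner described above can be sketched as follows. This is a generic A* implementation over a weighted node grid, not the APR's actual code: the heuristic is the Manhattan distance, and each cell carries a traversal cost, so penalizing nodes near obstacles simply means storing a larger cost in those cells.

```python
import heapq
import itertools

def a_star(grid, start, goal):
    """A* search on a 4-connected grid. grid[y][x] is the cost of stepping
    onto cell (x, y) (higher near obstacles), or None if not navigable.
    Returns the path as a list of (x, y) nodes, or None if unreachable."""
    def h(n):  # Manhattan heuristic
        return abs(n[0] - goal[0]) + abs(n[1] - goal[1])

    tie = itertools.count()  # unique counter: breaks ties in the heap
    frontier = [(h(start), next(tie), 0, start, None)]
    came_from, best_g = {}, {start: 0}
    while frontier:
        _, _, g, node, parent = heapq.heappop(frontier)
        if node in came_from:  # already expanded with a better cost
            continue
        came_from[node] = parent
        if node == goal:       # rebuild the path back to the start
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        x, y = node
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= ny < len(grid) and 0 <= nx < len(grid[0]) \
                    and grid[ny][nx] is not None:
                ng = g + grid[ny][nx]
                if ng < best_g.get((nx, ny), float("inf")):
                    best_g[(nx, ny)] = ng
                    heapq.heappush(frontier,
                                   (ng + h((nx, ny)), next(tie), ng,
                                    (nx, ny), node))
    return None
```

With uniform cell costs this returns a shortest Manhattan path; raising the cost of cells adjacent to walls makes the planner trade extra distance for clearance, as the text describes.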

Information Maps.
This paper proposes the creation and use of information maps for ambient supervision. The proposed methodology is based on the work presented in [26], which focused on the detection of gas leakages. In this case, the information maps are computed from raw sensor data gathered constantly at 1 Hz by the robot while performing other assistive tasks or during specific ambient supervision explorations. Figure 3 shows a picture of the ARM-based embedded sensor board designed as a module to communicate with the APR and to provide information on temperature, humidity, and luminance. This embedded sensor system has been designed as a versatile USB accessory that can be plugged into different locations on the mobile robot according to the environmental requirements. It is also designed to provide redundant or complementary ambient information, for example, from the front or from the back of the mobile robot.
The data obtained by the embedded sensors are processed and stored in a custom structure that contains the raw reading from each sensor along with the current robot coordinates and a timestamp identifying the time at which the reading was obtained. The information maps are generated from a combination of the localized sensor samples and the virtual area map built by the SLAM procedure. The sensor samples are integrated into the virtual map according to their relative two-dimensional coordinates with a resolution of 1 mm. At this point, the samples are interpolated using the biharmonic spline method [27] in order to fill the virtual map boundaries. As a result, these information maps present the ambient information in an effective manner that can be visually presented or automatically processed in order to define inputs for further corrective actuations. The accuracy of this methodology depends mainly on the number of samples gathered by the robot inside the supervised area. Nevertheless, this application does not require saturated patrolling in order to obtain useful results. In this paper, the representation of the ambient conditions has been estimated from a single pass through a corridor. Furthermore, the information maps are segmented into different areas (corridor, hall, office, restroom, etc.) with the different human comfort levels defined in [28]. It is interesting to note that comfort ranges are not fixed and may vary with the season.
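The interpolation step can be sketched with the classical 2D biharmonic (Green's-function) spline; this minimal NumPy version is an illustration under that assumption, not the authors' implementation. Each sample contributes a radial basis φ(r) = r²(ln r − 1), and the spline weights come from solving one dense linear system.

```python
import numpy as np

def biharmonic_interpolate(points, values, query):
    """2D biharmonic spline interpolation (Sandwell-style Green's function).
    points: (n, 2) sample coordinates; values: (n,) readings;
    query: (m, 2) coordinates to estimate. Returns (m,) estimates."""
    def phi(r):  # Green's function of the biharmonic operator in 2D
        with np.errstate(divide="ignore", invalid="ignore"):
            g = r * r * (np.log(r) - 1.0)
        return np.where(r > 0, g, 0.0)

    pts = np.asarray(points, dtype=float)
    # Pairwise distances between samples, then solve for the weights
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    w = np.linalg.solve(phi(d), np.asarray(values, dtype=float))
    # Evaluate the spline at the query coordinates
    dq = np.linalg.norm(np.asarray(query, dtype=float)[:, None, :]
                        - pts[None, :, :], axis=-1)
    return phi(dq) @ w

# Example: five localized temperature readings, evaluated back at the samples
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (0.5, 0.3)]
vals = [21.0, 22.5, 21.8, 23.1, 22.0]
est = biharmonic_interpolate(pts, vals, pts)
```

By construction the spline passes through the measured samples, so evaluating it at the sample coordinates reproduces the readings; evaluating it on a dense grid of map cells produces the filled information map.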

Experimentation and Results
The experimental stage of this work was carried out on the 2nd floor of the Polytechnic School of the University of Lleida. The mobile robot was programmed to perform an ambient control routine patrol on this floor. The virtual map of the scenario was previously built and available for use in each robot mission inside this area. The robot navigation plan can be defined by setting a list of key nodes that the robot must visit in the specified order before returning to its initial position to conclude its exploration. In this experiment, the robot performs a predefined cyclic exploration that starts in a base position in one laboratory; the robot must then leave the laboratory, reach the emergency exit at the end of the hallway, visit the hall entrance, and return to its starting point inside the laboratory. Figure 4 shows the trajectory of the APR in the virtual map of the scenario of the proposed experiment. The mobile robot spent 760.8 seconds in this exploration, gathered a total of 1692 samples, and traveled a total of 98.08 meters.
Depending on the planned application, the information maps with temperature, humidity, and luminance can be computed online or after finishing the ambient supervision mission. The results from the described trajectory are shown in Figures 5, 6, and 7, in which the colors of the mesh are relative to the maximum and minimum values measured during the whole exploration. These maps provide accurate information on the ambient conditions that can be stored, processed, or compared in further analyses. The mesh and colors represent value variations along the explored and measured area. This information can also be processed in order to provide specific information on the human comfort ranges in the different areas explored. As a result, these segmented information maps are easy to interpret, as the color directly warns about the necessity of implementing corrective actuations. Figure 8 shows the results of the proposed comfort segmentation procedure, in which green areas depict adequate ambient conditions. Reddish zones in the temperature and humidity maps identify areas with values over the recommended comfort levels, while bluish zones identify areas with readings below the recommended ranges. In the case of the luminance map, a reddish color depicts areas with low lighting levels. Finally, the information map of Figure 8(c) showed that the luminance levels of the corridor of the facility were not adequate according to the predefined comfort levels. In this case, the exploration was performed in the afternoon of a stormy day, just at a moment when the natural illumination provided by the sun was clearly insufficient for a public facility. A further interconnection of the APR with the main systems of the building could be used to correct this situation automatically by turning the illumination of the corridor on and off.
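The segmentation into below/within/above comfort classes can be sketched as follows. The numeric bands here are illustrative placeholders, not the values of [28], which also vary by season and by area type; for luminance only insufficient light is flagged, matching the coloring described above.

```python
def comfort_class(value, low, high):
    """Classify a reading against a comfort band."""
    if value < low:
        return "below"
    if value > high:
        return "above"
    return "ok"

# Hypothetical comfort bands per monitored parameter
COMFORT_BANDS = {
    "temperature_c": (20.0, 24.0),
    "humidity_pct": (30.0, 60.0),
    "luminance_lux": (300.0, float("inf")),  # only low light is flagged
}

def segment(readings):
    """Map each reading to its comfort class, e.g. for map coloring."""
    return {k: comfort_class(v, *COMFORT_BANDS[k])
            for k, v in readings.items()}

result = segment({"temperature_c": 26.0, "humidity_pct": 45.0,
                  "luminance_lux": 120.0})
```

Applying this cell by cell to the interpolated information maps yields the three-color segmented maps: "above" and "below" cells are the ones that warrant a corrective actuation.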

Conclusions
This work presents an AmI application based on the use of an Assistant Personal Robot for the supervision of temperature, humidity, and luminance conditions. This application is conceived as a complementary function of an intelligent assistant mobile robot capable of performing autonomous tasks in indoor environments. Intelligent robots such as the one used in this paper provide high computational capabilities and enough resources to enable a multiagent control system capable of running different robot processes simultaneously. This methodology allows the robot to work on its main assistive tasks while executing an ambient supervision routine that constantly gathers and processes additional sensor data. This method enables the creation of information maps combining sensor data and the virtual map of the experimental area, based on the application of an ICP-based SLAM procedure. The information maps provide detailed and localized ambient data that can be segmented according to comfort levels. The presented AmI system can be connected to a main control unit that manages the functionalities of multiple actuators (dimmable lights, air conditioning systems, ventilation, blind control, etc.) to perform the corresponding corrective actions to maintain the comfort levels inside the recommended range.
Future work will focus on the implementation of a robust AmI framework conforming to the presented methodology in order to build embedded intelligent control systems. Results from the information maps should also be interpreted by the AmI system in order to detect ambient nonconformities and to maintain constant ambient supervision. Future work will also consider the combination of multiple robots working on the same or on different floors of the same building, studying the possibility of creating a multiagent system that manages the ambient conditions of an entire building.

Figure 2: Virtual map created by the robot and the weighted navigable area.

Figure 3: Embedded ARM-based sensor system capable of sampling temperature, humidity, and luminance.

Figure 4: Path followed by the APR during the ambient supervision experiment.

Figure 5: Temperature information map from the experimentation area.

Figure 6: Humidity information map from the experimentation area.

Figure 7: Luminance information map from the experimentation area.