Stress experiences in neighborhood and social environments (SENSE): a pilot study to integrate the quantified self with citizen science to improve the built environment and health.

BACKGROUND: Identifying elements of one's environment, observable and unobservable, that contribute to chronic stress, including the perception of comfort and discomfort associated with different settings, presents many methodological and analytical challenges. However, it also presents an opportunity to engage the public in collecting and analyzing their own geospatial and biometric data to increase community member understanding of their local environments and activate potential environmental improvements. In this first-generation project, we developed a methodology to integrate geospatial technology with biometric sensing within a previously developed, evidence-based "citizen science" protocol, called "Our Voice." Participants used a smartphone/tablet-based application, called the Discovery Tool (DT), to collect photos and audio narratives about elements of the built environment that contributed to or detracted from their well-being. A wrist-worn sensor (Empatica E4) was used to collect time-stamped data, including 3-axis accelerometry, skin temperature, blood volume pulse, heart rate, inter-beat interval, and electrodermal activity (EDA). Open-source R packages were employed to automatically organize, clean, geocode, and visualize the biometric data.
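
The specific R packages used are not named in this abstract, so the following is only a minimal sketch of the kind of workflow described: reading and cleaning raw Empatica E4 electrodermal activity output and plotting it over time. The package choices (dplyr, lubridate, ggplot2), the assumed file layout, and all file and column names are illustrative, not taken from the study.

```r
# Sketch only: the E4 export format (Unix start time in row 1, sampling rate in
# row 2, one value per subsequent row) and all names here are assumptions.
library(dplyr)
library(lubridate)
library(ggplot2)

read_e4_eda <- function(path, participant_id) {
  raw    <- read.csv(path, header = FALSE)
  start  <- as_datetime(raw[1, 1])    # recording start (Unix seconds)
  hz     <- raw[2, 1]                 # sampling rate in Hz
  values <- raw[-(1:2), 1]            # EDA readings in microsiemens
  tibble(
    participant = participant_id,
    time        = start + (seq_along(values) - 1) / hz,
    eda         = values
  )
}

eda <- read_e4_eda("P01/EDA.csv", "P01") |>
  filter(eda > 0)                     # drop zero readings from sensor dropout

ggplot(eda, aes(time, eda)) +
  geom_line() +
  labs(x = "Time", y = "EDA (microsiemens)",
       title = "EDA during neighborhood walk (participant P01)")
```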

RESULTS: In total, 14 adults (8 women, 6 men) were successfully recruited to participate in the investigation. Participants recorded 174 images and 124 audio files with the DT. Among captured images with a participant-determined positive or negative rating (n = 131), over half were positive (58.8%, n = 77). Within-participant positive/negative rating ratios were similar, with participants rating, on average, 53.0% of their images as positive (SD 21.4%). Significant spatial clusters of positive and negative photos were identified using the Getis-Ord Gi* local statistic, and linear mixed models revealed significant associations between participant EDA and distance to DT photos, as well as street and land use characteristics. Interactive data maps allowed participants to (1) reflect on data collected during the neighborhood walk, (2) see how EDA levels changed over the course of the walk in relation to objective neighborhood features (using basemap and DT app photos), and (3) compare their data to those of other participants along the same route.
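
The abstract does not specify how these analyses were implemented, so the sketch below shows one common way to carry out both steps in R: computing the Getis-Ord Gi* local statistic over photo ratings with spdep, and fitting a linear mixed model of EDA on distance to DT photos with lme4. The data frames (photos, walks), the 0.2 km neighborhood distance, and the covariate names are hypothetical.

```r
# Illustrative only: data frames, variable names, and thresholds are assumed.
library(spdep)   # spatial neighbors and local Gi* statistic
library(lme4)    # linear mixed models

# --- Spatial clustering of photo ratings (1 = positive, 0 = negative) ---
coords <- as.matrix(photos[, c("lon", "lat")])
nb     <- dnearneigh(coords, d1 = 0, d2 = 0.2, longlat = TRUE)          # neighbors within 0.2 km
lw     <- nb2listw(include.self(nb), style = "B", zero.policy = TRUE)   # self-inclusion gives Gi*
photos$gi_star <- as.numeric(localG(photos$rating, lw, zero.policy = TRUE))
# |gi_star| > 1.96 flags significant hot spots (positive) or cold spots (negative)

# --- EDA versus distance to DT photos and street/land-use characteristics ---
# A random intercept per participant accounts for repeated EDA measures within each walk.
m <- lmer(eda ~ dist_to_photo + street_type + land_use + (1 | participant),
          data = walks)
summary(m)
```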

CONCLUSIONS: Participants identified a variety of social and environmental features that contributed to or detracted from their well-being. This initial investigation sets the stage for further research combining qualitative and quantitative data capture and interpretation to identify objective and perceived elements of the built environment that influence our embodied experience in different settings. It provides a systematic process for simultaneously collecting multiple kinds of data, and lays a foundation for future statistical and spatial analyses as well as more in-depth interpretation of how these responses vary within and between individuals.
