Comparative Analysis of Five Observational Audit Tools to Assess the Physical Environment of Parks for Physical Activity, 2016

Rodney P. Joseph, PhD1; Jay E. Maddock, PhD2 (View author affiliations)

Suggested citation for this article: Joseph RP, Maddock JE. Comparative Analysis of Five Observational Audit Tools to Assess the Physical Environment of Parks for Physical Activity, 2016. Prev Chronic Dis 2016;13:160176. DOI: http://dx.doi.org/10.5888/pcd13.160176.
PEER REVIEWED

Abstract

We reviewed prominent audit tools used to assess the physical environment of parks and their potential to promote physical activity. To accomplish this, we manually searched the Active Living Research website (http://www.activelivingresearch.com) for published observational audit tools that evaluate the physical environment of parks, and we reviewed park audit tools used in studies included in a systematic review of observational park-based physical activity studies. We identified 5 observational audit tools for review: Bedimo-Rung Assessment Tool–Direct Observation (BRAT-DO), Community Park Audit Tool (CPAT), Environmental Assessment of Public Recreation Spaces (EAPRS) tool, Physical Activity Resource Assessment (PARA), and Quality of Public Open Space Tool (POST). All 5 tools have established inter-rater reliability estimates ranging from moderate to good. However, BRAT-DO is the only tool with published validity estimates. We found substantial heterogeneity among the 5 tools in length, format, intended users, and specific items assessed. Researchers, practitioners, and community coalition members should review the goal of their specific project and match that goal with the most appropriate tool and the people who will be using it.

Introduction

The relationship between physical activity and health is well established (1–3); however, despite the benefits of physical activity, most Americans are insufficiently active (4). Parks have the potential to influence physical activity behaviors (5) because they are normally close to residential neighborhoods, have specific structures designed for physical activity, and usually can be accessed at no cost. However, the mere presence of a park does not necessarily mean that it supports physical activity. The degree to which parks encourage physical activity is largely determined by their design, their perceived safety, and the presence of park structures that support physical activity (6–9).
Researchers have developed various observational audit tools to examine the physical environment of parks and their potential to promote physical activity. However, the format, length (ie, number of items), assessment methods, literacy required to complete assessments, and characteristics and features of these tools vary widely. We reviewed 5 prominent audit tools for assessing the physical environment of parks to assist public health practitioners, researchers, and community members in determining which audit tool is best for them to use for a specific research project or evaluation.

Basis for Comparison

We conducted a selective review of environmental audit tools. To identify relevant measures, we manually searched the Active Living Research website (http://www.activelivingresearch.com) for published observational audit tools that evaluate the physical environment of park-based settings, and we reviewed tools used by researchers in studies included in a systematic review of observational studies of park-based physical activity (10). When selecting tools for inclusion in the review, we focused on those that evaluated both the open spaces and physical structures of parks. Our initial search procedures revealed 7 potential tools for inclusion in the review: Bedimo-Rung Assessment Tool–Direct Observation (BRAT-DO) (11), Community Park Audit Tool (CPAT) (12), Environmental Assessment of Public Recreation Spaces (EAPRS) tool (13), Physical Activity Resource Assessment (PARA) instrument (14), Path Environment Audit Tool (PEAT) (15), Rural Active Living Assessment (RALA) Tools (16), and Quality of Public Open Space Tool (POST) (17). After reviewing the tools and applying our inclusion criteria, we selected 5 for inclusion in the review: BRAT-DO (11), CPAT (12), EAPRS (13), PARA (14), and POST (17). PEAT and RALA tools were excluded because they did not include a detailed assessment of the physical features of parks (PEAT focuses on trails or paths and RALA focuses on the physical features and policy characteristics of entire towns and communities rather than individual parks).
In comparing the measures, our first level of review evaluated the following characteristics: length, intended users, data collection method, reliability, and validity. The second level of review was a more detailed analysis of each tool based on the specific features assessed. We reviewed each measure and classified items in the following categories: activity areas, sitting and resting features, landscape features, facilities (rest rooms, showers, event venues), eating and drinking features, access and neighborhood characteristics, signage, safety-related features, incivilities (evidence of offensive behavior, eg, litter, graffiti, loud noise), and the park’s condition and maintenance. We placed items not fitting into these categories in a category called “other items assessed.” Last, we compared the 5 tools on the basis of the outcomes assessed in the first 2 levels of review.
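As a concrete illustration of this second level of review, the following minimal sketch (Python) shows one way a feature-by-tool comparison can be tabulated and summarized. The tabulation structure is ours and purely illustrative, not the authors’ method; the example rows are drawn from findings reported in the Park Features Assessed section below.

```python
# Illustrative sketch only: encode which tools assess a given feature as a set,
# then count coverage per tool. Example rows reflect findings reported in this
# review; the data structure itself is not part of any audit tool.
TOOLS = ["BRAT-DO", "CPAT", "EAPRS", "PARA", "POST"]

assesses = {
    "benches":         {"BRAT-DO", "CPAT", "EAPRS", "POST"},           # all except PARA
    "golf courses":    {"BRAT-DO", "EAPRS"},                           # only these 2
    "water fountains": {"BRAT-DO", "CPAT", "EAPRS", "PARA", "POST"},   # all 5
    "dogs present":    {"CPAT", "EAPRS", "PARA", "POST"},              # all except BRAT-DO
}

# Count how many of these example features each tool covers
for tool in TOOLS:
    covered = sum(tool in tools_with_item for tools_with_item in assesses.values())
    print(f"{tool}: {covered}/{len(assesses)} example features")
```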

Overview of Audit Tools

Community Park Audit Tool (CPAT)

CPAT (http://www.activelivingresearch.com) comprises 28 items and is divided into 4 sections: park information, access and surrounding neighborhood, park activity areas, and park quality and safety. CPAT was designed for community members, stakeholders, and researchers to easily assess community parks with an emphasis on the physical activity of youths. Inter-rater reliability of the CPAT is generally moderate to high; most items have agreement ranging from 80% to 90% (12). CPAT was originally developed as a pencil-and-paper instrument (6 pages), but an electronic version is now available.

Bedimo-Rung Assessment Tool–Direct Observation (BRAT-DO)

BRAT-DO (http://publichealth.lsuhsc.edu/Faculty_Pages/rung/index_files/page0004.htm) is a pencil-and-paper tool designed to identify and evaluate the physical structures and characteristics of parks. BRAT-DO is intended for use by researchers, consists of 181 items (16 pages), and assesses the following domains: access, condition, aesthetics, features, and safety. The tool has established inter-rater reliability among people trained in the assessment protocol; inter-rater agreement across domains ranged from 67.6% to 100% (11). Validity was determined by comparing outcomes achieved by trained data collectors with those of expert data collectors; average inter-rater agreement ranged from 68.3% to 88.3%.

Environmental Assessment of Public Recreation Spaces (EAPRS) Tool

EAPRS (13) (http://www.activelivingresearch.com) is the most comprehensive observational audit measure available. It is in its sixth revision and is available in pencil-and-paper format. EAPRS contains 751 items in 16 sections and is 59 pages long. Inter-rater reliability varies by section or domain assessed; however, most items in the tool demonstrate moderate-to-high reliability estimates (eg, κ ≥ .60) (13).

Physical Activity Resource Assessment (PARA) instrument

PARA (14) (http://grants.hhp.coe.uh.edu/undo/?page_id=21) is a brief, 1-page (49 items) instrument that assesses park type, features, amenities, qualities, and incivilities. It was originally developed to assess resources in low-income, urban communities that surround public housing developments (14). PARA is not limited to the assessment of parks or playgrounds; it can also be used to audit other physical environments. Inter-rater reliability is moderate to high (κ > .77).

Quality of Public Open Space Tool (POST)

POST (17) (http://www.see.uwa.edu.au/research/cbeh/projects/post) is a pencil-and-paper–based tool designed to evaluate physical attributes of public open spaces, including parks, that may influence physical activity. POST is 2.5 pages in length and consists of 88 items. Inter-rater agreement for the items ranges from 50% to 98%, with most items demonstrating greater than 85% agreement (18).

Comparative Analysis

We compared the 5 tools by length, intended users, data collection method, reliability, and validity (Table 1). PARA was the shortest at 1 page (49 items), and EAPRS was the longest at 59 pages (751 items). The other 3 measures ranged in length from 2.5 to 16 pages. All 5 are designed for use by researchers and public health practitioners; CPAT and PARA are also appropriate for community members. All 5 measures are available in pencil-and-paper format. CPAT is also available in an electronic format for use on tablet or smartphone. The assessment protocols for all 5 measures are similar: parks are divided into segments or activity areas, and each segment is evaluated individually.
All 5 measures have established reliability with moderate-to-good inter-rater reliability estimates (Table 1). BRAT-DO was the only measure with validity estimates, which were determined by inter-rater agreement (ie, percentage agreement) between trained data collectors and expert data collectors (data from the expert data collectors were treated as the gold standard, or criterion). Overall domain validity for BRAT-DO (ie, for items assessing access, condition, aesthetics, features, and safety) was 78.7%, and overall geographic validity (ie, for items assessing streets, courts, green spaces, paths, playgrounds, and sports fields) was 81.5%.
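For readers unfamiliar with these statistics, the following sketch (Python) shows how the two quantities cited throughout this review, percentage agreement and Cohen’s kappa, are typically computed for a pair of raters auditing the same set of park items. The rating data below are hypothetical and are not drawn from any of the reviewed tools.

```python
# Illustrative computation of the two inter-rater statistics cited in this review.
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Share of items on which two raters gave the same rating."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Agreement corrected for chance: kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(rater_a)
    p_o = percent_agreement(rater_a, rater_b)
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical presence/absence codes for 10 park features from 2 auditors
rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater_b = [1, 1, 0, 1, 1, 1, 1, 0, 0, 1]
print(percent_agreement(rater_a, rater_b))        # 0.8 (80% agreement)
print(round(cohens_kappa(rater_a, rater_b), 2))   # ~0.52 (moderate agreement)
```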

Park Features Assessed

A detailed table listing the items assessed by each audit tool, which shows how we derived the information presented in Table 2, is available from the authors.
Children’s play areas. All 5 tools include at least 1 item to assess the presence of play structures or play sets. BRAT-DO, EAPRS, and POST provide the most detailed assessment of children’s play structures, because they assess specific types of structures (eg, swings, slides, climbing equipment).
Courts. All 5 tools assess the presence of courts; however, the type of court varies. All 5 assess basketball and tennis courts. Three (CPAT, EAPRS, and PARA) also assess volleyball courts. Only EAPRS assesses the presence of handball courts, and only POST assesses netball courts.
Fields and open green spaces. All 5 measures evaluate the presence of soccer and softball or baseball fields. Other field types evaluated are football or rugby (CPAT, BRAT-DO, and EAPRS) and cricket (POST). CPAT, BRAT-DO, and EAPRS also include items to assess the presence of open green spaces and fields for general play.
Golf courses. EAPRS and BRAT-DO are the only 2 measures that assess the presence of golf courses. EAPRS is the most detailed, because it assesses the presence of regular (9- or 18-hole), miniature, and Frisbee golf courses. BRAT-DO evaluates only the presence of a regular golf course.
Running and walking features. All 5 measures assess the presence of unpaved trails or paths for running and/or walking activities. Two measures, PARA and EAPRS, also evaluate the presence of sidewalks.
Water features. Four measures, CPAT, BRAT-DO, EAPRS, and POST, evaluate the presence of ponds, lakes, creeks, and streams within parks. CPAT, BRAT-DO, EAPRS, and PARA also include items to evaluate the presence of swimming and wading pools. CPAT is the only measure assessing splash pads. Only BRAT-DO and EAPRS assess whether the park is located on a beach or riverfront.
Other types of activity areas. The audit tools reviewed also assess the presence of other activity areas. For example, CPAT and PARA assess the presence of exercise stations or exercise equipment. Likewise, CPAT and EAPRS assess the presence of dedicated dog parks. EAPRS provides the most comprehensive assessment, because it is the only measure to assess batting cages, BMX (bicycle motocross) tracks, driving ranges, horseshoe pits, shuffleboards, and other features.
Sitting and resting features. All 5 tools evaluate the presence of picnic tables. All except PARA assess the presence of benches. EAPRS also assesses the presence of other types of sitting or resting features, including seat walls and bleachers.
Landscaping features. All measures include items to assess landscape features. Common items assessed are trees, flower and garden areas, fountains, and areas with shade.
Facilities. All 5 measures include items that evaluate the presence of restrooms and trash cans. BRAT-DO is the only measure to assess whether play or athletic equipment is available for rent. PARA is the only measure to assess the presence of showers or changing rooms, and POST is the only measure to evaluate the presence of meeting rooms. EAPRS and POST also assess the presence of event venues or stages.
Eating and drinking features. All 5 tools assess the presence of water fountains. BRAT-DO, EAPRS, and POST also evaluate the presence of grills or barbeques. All measures except PARA assess the presence of other food sources (eg, vending machines, concession stands).
Access and neighborhood characteristics. All measures include items to evaluate park access and environmental supports for transit to parks. Common items include the presence of sidewalks or paths connecting to the park, adjacent roadways and parking lots, and the presence of bike racks. CPAT is the only measure to assess the presence of a nearby public transit stop. Only 2 measures, CPAT and BRAT-DO, assess whether a park can be locked or is locked at the time of audit. All measures except PARA assess the quality of the neighborhood surrounding the park.
Signage. CPAT and BRAT-DO are the most detailed measures of signage, because they evaluate the presence of posted park rules and regulations, hours, event programming, dog and pet user rules, and information on how to reserve areas and other types of signage. PARA and POST are the least detailed. EAPRS assesses the presence of signs for rules and regulations, maps, and event programming.
Safety-related features. All measures assess lighting. CPAT, BRAT-DO, and EAPRS also assess the presence of telephones, emergency call boxes, and onsite park staff. CPAT is the only measure to assess threatening behavior among park users (ie, presence of gangs and alcohol or drug use). PARA and POST are the least detailed in assessing park safety features; they assess only the presence of lighting.
Incivilities. All measures assess the presence of incivilities. CPAT, BRAT-DO, PARA, and POST are the most detailed; example items are litter, graffiti, and loud noise. EAPRS includes only a few items to assess incivilities.
Condition and maintenance. All measures except POST evaluate the condition and maintenance of parks. However, the methods by which the measures assess these aspects differ. CPAT, BRAT-DO, and EAPRS first assess the presence of a specific park feature or amenity and then use a separate additional item to assess its condition and maintenance. PARA uses a combined scale to assess both the presence and condition of the features.
Dogs. All measures except BRAT-DO include items to assess the presence of dogs. CPAT and POST further expand on this topic by including an item to assess whether dog waste bags are present.

Considerations for Selecting an Audit Tool

Because of the heterogeneity of the 5 audit tools reviewed, practical guidance on the usefulness of a tool for specific research projects is warranted. Table 3 illustrates common audit scenarios faced by both researchers and public health practitioners and the corresponding audit tools best suited for each scenario.

Discussion

Our comparative analysis showed heterogeneity in the length of the audit tools. Two tools, PARA and CPAT, were designed for researchers, public health practitioners, and community members, whereas BRAT-DO, EAPRS, and POST were designed for researchers and public health practitioners only. A positive finding of our review was that all 5 audit tools have inter-rater reliability estimates that range from moderate to good. This finding is encouraging and speaks to the organization and clarity of the tools and their training manuals. With regard to validity, BRAT-DO is the only tool to report validity outcomes. Studies are needed to establish the validity of the other audit tools.
Choice of an audit measure for a project should be based on various factors, including the level of detail needed, the education and skill level of the people performing the assessment, the time needed to complete the assessment, the target population of park users (ie, children vs adults), and the resources available to train assessors and to conduct the assessment. When choosing a tool, researchers will often face a trade-off among these factors, because no single tool will perfectly address all project specifications. Despite this trade-off, all 5 measures reviewed show promise for evaluating the built environment of parks for the support of physical activity.
Although comprehensive, our review has limitations. We limited our selection to audit tools used in studies included in our systematic review of park-based physical activity studies (10) and those available on the Active Living Research website. Therefore, other audit tools may be available that were not included in this review. Additionally, the process of comparing the audit tools was challenging because of the heterogeneity of the format, detail, and length of the tools. Accordingly, our comparative analysis oversimplified many of the park features assessed by each audit tool. This was particularly the case with our review of EAPRS, because this tool collects extensive detail for every feature or aspect assessed. We also would have liked to compare time required to audit a park of similar size and features using each tool. However, we were unable to perform this comparison because of the heterogeneity of parks used in the development and validation of the audit tools. Studies are needed to compare the approximate amount of time it takes to audit a standardized park with each tool.
Despite these limitations, this review has several strengths. To our knowledge, this is the first review of observational audit tools to evaluate the physical environment of parks in relation to physical activity. Findings are designed to help researchers, public health practitioners, and community coalitions identify an appropriate tool for a particular project. Another strength was the high level of detail used to evaluate the 5 measures. We compared various aspects of each tool, including length, assessment method, intended users, reliability, validity, and specific park features assessed. This level of detail strengthens the quality and usefulness of the review.
Our review was intended to provide guidance in selecting the best audit tool for a proposed park evaluation project. When selecting an appropriate tool, researchers should review the goal of their project and match their goal with the most appropriate tool, because the method of assessment will have a direct relationship with validity and usefulness of the outcomes. Likewise, researchers may find it necessary to use multiple audit tools or to combine sections of different audit tools to evaluate the specific features of a park environment.

Acknowledgments

Preparation of this article was supported by funding from the National Institutes of Health/National Heart, Lung, and Blood Institute (NIH/NHLBI), award K99 HL129012-01 (R. Joseph, principal investigator), and the NIH/National Institute of Nursing Research (NIH/NINR), award T32 1T32NR012718-01, Transdisciplinary Training in Health Disparities Science (C. Keller, principal investigator). The content is solely the responsibility of the authors and does not necessarily represent the official views of NIH.

Author Information

Corresponding Author: Rodney P. Joseph, PhD, College of Nursing and Health Innovation, Arizona State University, 500 N 3rd St, Phoenix, AZ 85004. Telephone: 602-496-0772. Email: Rodney.Joseph@asu.edu.
Author Affiliations: 1College of Nursing and Health Innovation, Arizona State University, Phoenix, Arizona. 2Department of Environmental and Occupational Health, School of Public Health, Texas A&M Health Science Center, College Station, Texas.

References

  1. US Department of Health and Human Services. Healthy people 2020. Washington (DC): Office of Disease Prevention and Health Promotion; 2010.
  2. Centers for Disease Control and Prevention. Chronic diseases and health promotion. http://www.cdc.gov/chronicdisease/overview/index.htm. Accessed August 8, 2013.
  3. US Department of Health and Human Services. 2008 Physical activity guidelines for Americans. Hyattsville (MD): US Department of Health and Human Services; 2008. http://health.gov/paguidelines/pdf/paguide.pdf. Accessed April 7, 2016.
  4. Centers for Disease Control and Prevention. National Center for Health Statistics. Exercise or Physical Activity. http://www.cdc.gov/nchs/fastats/exercise.htm. Accessed November 20, 2016.
  5. McKenzie TL, Cohen DA, Sehgal A, Williamson S, Golinelli D. System for Observing Play and Recreation in Communities (SOPARC): reliability and feasibility measures. J Phys Act Health 2006;3(s1, Suppl 1):S208–22. CrossRef PubMed
  6. Kaczynski AT, Potwarka LR, Saelens BE. Association of park size, distance, and features with physical activity in neighborhood parks. Am J Public Health 2008;98(8):1451–6. CrossRef PubMed
  7. Kaczynski AT, Koohsari MJ, Stanis SAW, Bergstrom R, Sugiyama T. Association of street connectivity and road traffic speed with park usage and park-based physical activity. Am J Health Promot 2014;28(3):197–203. CrossRef PubMed
  8. Banda JA, Wilcox S, Colabianchi N, Hooker SP, Kaczynski AT, Hussey J. The associations between park environments and park use in southern US communities. J Rural Health 2014;30(4):369–78. CrossRef PubMed
  9. Kaczynski AT, Henderson KA. Environmental correlates of physical activity: a review of evidence about parks and recreation. Leis Sci 2007;29(4):315–54. CrossRef
  10. Joseph RP, Maddock JE. Observational park-based physical activity studies: a systematic review of the literature. Prev Med 2016;89:257–77. CrossRef PubMed
  11. Bedimo-Rung AL, Gustat J, Tompkins BJ, Rice J, Thomson J. Development of a direct observation instrument to measure environmental characteristics of parks for physical activity. J Phys Act Health 2006;3(s1):S176–89. CrossRef
  12. Kaczynski AT, Stanis SA, Besenyi GM. Development and testing of a community stakeholder park audit tool. Am J Prev Med 2012;42(3):242–9. CrossRef PubMed
  13. Saelens BE, Frank LD, Auffrey C, Whitaker RC, Burdette HL, Colabianchi N. Measuring physical environments of parks and playgrounds: EAPRS instrument development and inter-rater reliability. J Phys Act Health 2006;3(s1):S190–207. CrossRef
  14. Lee RE, Booth KM, Reese-Smith JY, Regan G, Howard HH. The Physical Activity Resource Assessment (PARA) instrument: evaluating features, amenities and incivilities of physical activity resources in urban neighborhoods. Int J Behav Nutr Phys Act 2005;2(1):13. CrossRef PubMed
  15. Troped PJ, Cromley EK, Fragala MS, Melly SJ, Hasbrouch HH, Gortmaker SL, et al. Development and reliability and validity testing of an audit tool for trail/path characteristics: the Path Environment Audit Tool (PEAT). J Phys Act Health 2006;3(s1, Suppl 1):17. CrossRef
  16. Yousefian A, Hennessy E, Umstattd MR, Economos CD, Hallam JS, Hyatt RR, et al. Development of the Rural Active Living Assessment Tools: measuring rural environments. Prev Med 2010;50(Suppl 1):S86–92. CrossRef PubMed
  17. Broomhall M, Giles-Corti B, Lange A. Quality of Public Open Space Tool (POST). Perth (AU): School of Population Health, The University of Western Australia; 2004.
  18. The University of Western Australia School of Earth and Environment. POST reliability results; 2003. http://www.see.uwa.edu.au/research/cbeh/projects/?a=411952. Accessed April 7, 2016.
