Beyond data quality: What else is planned for STARS?
On behalf of the STARS team, I want to first extend a big thank you to Julian Dautremont-Smith for initiating the conversation on STARS data quality and helping to spark some really wonderful discussion and debate on the Green Schools listserv! AASHE staff have been following the dialogue, and we’re looking forward to incorporating your feedback as we work toward implementing STARS data accuracy solutions. We’re especially excited to continue the discussion in person at the AASHE 2014 Conference & Expo, happening in just a few days!
Some of the discussion on the Green Schools data accuracy thread has prompted questions about other areas within STARS, so AASHE staff and the STARS Steering Committee want to take this opportunity to share what else we are thinking about (and hopefully we can have discussions about some of these other areas at the conference as well).
AASHE has just finished analyzing responses to the STARS 2014 Survey, which requested feedback last summer about (among other things) making STARS more widely used and accessible, improving the process of completing a report, and making STARS data more useful for institutions. Over 200 of you took the survey, and your feedback and comments have been very insightful! Some of the top themes coming out of the survey included:
- Enabling better STARS data comparison with other institutions
- Continuous improvement of STARS credits through technical development
- Reducing the complexity of STARS
- Providing a more useful and engaging summary of STARS reports
- Reducing the cost of STARS and making it more accessible to resource-strapped institutions
With these findings in mind, and along with the ongoing discussion on data quality, here are some of the other things that the AASHE staff and STARS Steering Committee are working on and thinking about:
Streamlining the process
We’ve heard loud and clear that it takes too much time to participate, and that this is a major reason institutions either aren’t participating or are submitting only every two or three years instead of annually. To address this in our technical development process for 2.1 and beyond, we are looking at removing or consolidating credits, reducing unnecessary reporting fields, and improving the data collection and submission process (for example, by creating spreadsheets for each subcategory that institutions could use to collect data from campus stakeholders). An even bigger idea that we’ve just started to think about would be to identify a small group of “core” credits and allow institutions to earn some form of recognition for completing only those credits.
Providing more useful and engaging report summaries
We’ve received feedback that the layout of an institution’s final STARS report is not very engaging and does little to encourage members of the campus community to get excited about the findings and share this info with students, top administration, and others. To change this, we’re looking at ways to make the front page of each STARS submission more visually appealing and less credit-focused. The goal is to make STARS reports more approachable for prospective students and others who aren’t intimately familiar with STARS. One key feature of this project is a summary comparison showing how a particular institution is doing in each subcategory (e.g., Research, Water, Investment) relative to the average for similar institutions.
We’re also continuously thinking about how best to promote STARS ratings through publications like the STARS Annual Review and other content, with the goal of getting media, top administration, and current and prospective students more excited about STARS. Something we’re considering that also relates to the data quality issue is the idea of offering provisional ratings that are finalized after a submission review (and possibly making a big splash announcing final ratings in a publication).
Enhancing data displays
Access to data displays that allow for comparison and benchmarking between institutions is currently limited, and we’ve not yet incorporated version 2.0 data into the displays. AASHE staff are working to revamp the current data displays to give participants greater ability to easily compare and benchmark with other institutions. This would also help in spotting questionable quantitative data that could easily be missed in the more qualitative submission reviews that AASHE currently conducts. As we move forward on this project, we look forward to hearing from participants at the conference and through other communication channels about their ideas for improving data comparison.
Changing the business model to remove disincentives for annual participation
Currently, a STARS rating lasts up to three years. Institutions are welcome to resubmit more frequently (and some do), but many only submit every two or three years because it can be cheaper that way. Some participants even do all of the work of entering updated data into the STARS reporting system to submit to Sierra Magazine and The Princeton Review, but choose not to submit this same data to STARS because it would cost them money and require a new President’s letter and Innovation credits. The information in the STARS database is less complete and less up-to-date as a result. We’re considering other pricing options that would split the cost of participating over three years and remove the financial penalty for more frequent reporting. As a side benefit, this should make STARS costs easier to budget for participants (e.g., through a smaller annual cost instead of a large cost every two or three years). Another idea that has been floated is to allow each executive letter and Innovation credit to remain valid for a full three years, so participants wouldn’t necessarily have to replace them with every submission. We haven’t put forward any formal proposals on these topics yet, so it would be great to get feedback from participants about whether the current model is meeting your needs and whether alternatives should be considered.
In sharing these plans and ideas with you before the conference, we’re hoping to provide a more complete picture of what the staff and Steering Committee are considering in conjunction with improving STARS data quality. We’re excited to discuss some of these topics with you in more depth in Portland! Here are the STARS sessions at the conference where you can learn more about STARS and connect with AASHE staff and STARS Steering Committee members:
- STARS Introductory Workshop - a 3.5-hour session for individuals who are new to STARS and are seeking a better understanding of how the system works. Space is still available but limited (register). Sunday, 10/26, 8:30am-12pm.
- STARS Advanced Workshop - a 3.5-hour session for individuals who would like to engage in high-level discussion about STARS, with a focus on promoting data quality in submissions. Space is still available but limited (register). Sunday, 10/26, 1-4:30pm.
- STARS 101: Everything You Need to Know - Part of the regular conference schedule. Individuals attending this 1-hour session will hear first-hand how institutions are implementing STARS on their campuses. Monday, 10/27, 10:45-11:45am.
- STARS Town Hall - Part of the regular conference schedule. This 1-hour session features a small panel of STARS Steering Committee members, Technical Advisors, and AASHE staff who will facilitate discussion on the most current issues being tackled to further improve STARS. Tuesday, 10/28, 1-2pm.
Thanks again and we look forward to connecting with you soon!