WHAT IS BASEline?

BASEline is BASE Education’s assessment tool created to measure student growth.

BASEline is a full-scale, research-validated assessment that measures eight dynamic SEL outcomes.

This highly anticipated and groundbreaking assessment is a self-report measure that tracks changes in users across eight core competencies. Student responses are compiled into user-friendly, easy-to-read reports for school staff and administrators.

Why BASEline?

For Educators: To provide staff and administrators with the ability to track and report on the outcomes of the BASE program. The outcomes reported can support partners in homing in on areas of student need or strength.
For Students: To provide students with insight into their personal growth in the areas of behavior, truancy, engagement, and more. Self-reporting gives students the autonomy and empowerment to own their growth and achievement.
For the BASE Team: To provide BASE partners with the ability to track and report on outcomes. Additionally, this assessment allows the BASE team to track the ongoing effectiveness of the content and services, which will support product improvement.

BASEline Structure and Design

BASEline is a research-backed assessment created by our team of researchers. Drawing on feedback from teachers and BASE partners, the team developed BASEline to assess the most frequently requested educational factors. For continuity and ease of use, BASEline is built directly into the existing BASE software platform.

To create BASEline, our research team synthesized the latest in educational literature and psychometrics to develop an inclusive set of questions designed to evaluate eight main measures, while remaining aligned with CASEL (The Collaborative for Academic, Social, and Emotional Learning) competencies.

BASEline’s questions are structured to measure all five of CASEL’s main competency groups (https://casel.org/core-competencies/). Questions are sorted by the main CASEL competency addressed; for greater impact, many questions assess multiple CASEL competencies.

Questions are sorted into the following categories:

BASEline Reports

Twice per year, BASEline users receive a customized report summarizing their school’s progress on each of the eight measures.
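To illustrate the kind of summary such a report involves, here is a minimal Python sketch that averages scored responses by measure for one school. The record layout, measure names, and scoring scale are hypothetical assumptions for illustration only, not BASE Education's actual data format or reporting logic.

from collections import defaultdict
from statistics import mean

# Hypothetical layout: one scored response per student per measure.
# Measure names and the numeric scale are illustrative assumptions.
responses = [
    {"student_id": "s1", "measure": "engagement", "score": 3.2},
    {"student_id": "s2", "measure": "engagement", "score": 4.1},
    {"student_id": "s1", "measure": "behavior", "score": 2.8},
    {"student_id": "s2", "measure": "behavior", "score": 3.5},
]

def summarize_school(responses):
    """Average each measure across all students in one reporting window."""
    by_measure = defaultdict(list)
    for r in responses:
        by_measure[r["measure"]].append(r["score"])
    return {measure: round(mean(scores), 2) for measure, scores in by_measure.items()}

print(summarize_school(responses))  # {'engagement': 3.65, 'behavior': 3.15}

A real report would repeat this roll-up for each administration so that the two yearly reports can be compared side by side.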

FAQ

Q: Are student answers anonymous?
A: Yes! Only one staff member at BASE Education (the lead engineer) sees students’ raw data, which is necessary to ensure proper data collection methodology within educational systems. All other BASE Education research team members access only deidentified data, in compliance with FERPA’s use and re-disclosure limitations.

Q: Why do students take BASEline multiple times?
A: To follow research best practices, BASE Education uses a pre/post-test assessment format. Students take the same assessment (written verbatim) multiple times throughout the year to ensure test validity and reliability. Taking BASEline multiple times provides multiple data points for each student, allowing growth to be measured over time.
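As a rough sketch of how growth can be computed from repeated administrations, the Python snippet below subtracts a student's earlier (pre) scores from their later (post) scores on each measure. The field names and 1-5 scale are hypothetical assumptions, not BASEline's actual scoring.

# Hypothetical pre/post scores for one student on two of the eight measures.
pre  = {"engagement": 3.0, "self_management": 2.5}
post = {"engagement": 3.8, "self_management": 3.1}

def growth(pre_scores, post_scores):
    """Per-measure change between two administrations of the same assessment."""
    return {m: round(post_scores[m] - pre_scores[m], 2) for m in pre_scores}

print(growth(pre, post))  # {'engagement': 0.8, 'self_management': 0.6}

With more than two administrations, the same idea extends to a trend across all data points rather than a single pre/post difference.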

Q: Can I see my students’ raw data?
A: No. In order to protect student privacy and remain compliant with FERPA, only one person (the lead engineer at BASE Education) is permitted to see student raw data.

For more information about BASEline, contact us at:
support@base.education

REFERENCES

BASEline incorporates research and findings from the following publications:

Allen, J., Wright, S., Cranston, N., Watson, J., Beswick, K., & Hay, I. (2018). Raising levels of
school engagement and retention in rural, regional and disadvantaged areas: Is it a lost
cause? International Journal of Inclusive Education, 22(4), 409-425.
Anderson, G., Whipple, A., & Jimerson, S. (2003). Grade retention: Achievement and mental
health outcomes. The Center for Development and Learning.
Bessenyey, K. (2000). Risk assessment instruments: Can we predict recidivism?: A review of the
research. Graduate Student Theses, Dissertations, & Professional Papers, 9001.
Retrieved from https://scholarworks.umt.edu/etd/9001
Bradshaw, C., Waasdorp, T., Debnam, K., & Johnson, S. (2014). Measuring school climate in
high schools: A focus on safety, engagement, and the environment. Journal of School
Health, 84(9), 593-604.
Brundage, A., Castillo, J., & Moulton, S. (2017). Reasons for Chronic Absenteeism (RCA).
Florida’s Problem Solving/Response to Intervention Project, University of South Florida.
CASEL Guide Effective Social and Emotional Learning Programs. (2015). Retrieved May 20,
2019, from http://secondaryguide.casel.org/
Chronic Absenteeism. (2019). The Education Trust. Washington, DC.
Cole, J., Rocconi, L., & Gonyea, R. (2012). Accuracy of self-reported grades: Implications for
research. Retrieved from
http://cpr.indiana.edu/uploads/2012_AIR_Cole-Rocconi-Gonyea.pdf
Dressel, J. & Farid, H. (2018). The accuracy, fairness, and limits of predicting recidivism.
Science Advances, 4, 1-5.
Ginsburg, A., Jordan, P., & Chang, H. (2014). Absences add up: How school attendance
influences student success. Washington, DC: Attendance Works. Retrieved June 24, 2019,
from http://www.attendanceworks.org/wordpress/wp-content/
uploads/2014/09/Absenses-Add-Up_September-3rd-2014.pdf.
Hart, S., Stewart, K., & Jimerson, S. (2011). The student engagement in schools questionnaire
(SESQ) and the teacher engagement report form-new (TERF-N): Examining the
preliminary evidence. Contemporary School Psychology, 11, 67-79.
Hausam, J., Lehmann, R., & Dahle, K.P. (2018). Predicting offenders’ institutional misconduct
and recidivism: The utility of behavioral ratings by prison officers. Frontiers in
Psychiatry, 9.
Havik, T., Bru, E., & Ertesvåg, S. (2014). Assessing reasons for school non-attendance.
Scandinavian Journal of Educational Research.
Heilbrunn, J., & McGillivary, H. (2005). How to evaluate your truancy reduction program.
National Center for School Engagement. Denver, CO.
Herrera, C., DuBois, D., & Grossman, J. (2013). The Role of Risk: Mentoring Experiences and
Outcomes for Youth with Varying Risk Profiles. New York, NY: A Public/Private Ventures
project distributed by MDRC.
Kuncel, N. R., Crede, M., & Thomas, L. L. (2005). The validity of self-reported grade point
averages, class ranks, and test scores: A meta-analysis and review of the literature.
Review of Educational Research, 75(1), 63-82.
Lippman, L. & Rivers, A. (2008). Assessing School Engagement: A Guide for Out-of-School
Time Program Practitioners. Research-to-Results Child Trends Publication #2008-39.
Washington, DC: Child Trends.
Loza, W. & Green, K. (2003). The self-appraisal questionnaire: A self-report measure for
predicting recidivism versus clinician-administered measures: A 5-year follow-up study.
Journal of Interpersonal Violence, 18(7), 781-797.
Miller, F., Johnson, A., Yu, H., Chafouleas, S., McCoach, D., Riley-Tillman, T., Fabiano, G., &
Welsh, M. (2018). Method matters: A multi-trait multi-method analysis of student
behavior. Journal of School Psychology, 68, 53-72.
Nafekh, M. & Motiuk, L. (2002). The statistical information on recidivism – revised 1 (SIR-R1)
scale: A psychometric examination. Research Branch Correctional Service of Canada.
Napier Press Sociology. (2014). Measuring Truancy.
National Forum on Education Statistics. (2018). Forum Guide to Collecting and Using
Attendance Data (NFES 2017-007). U.S. Department of Education. Washington, DC:
National Center for Education Statistics.
National Forum on Education Statistics. (2009). Every school day counts: The Forum guide to
collecting and using attendance data (NFES No. 2009–804). National Center for
Education Statistics Working Paper. Washington, DC: U.S. Department of Education.
O’Gorman, E., Salmon, N., & Murphy, C. (2016). Schools as sanctuaries: A systematic review of
contextual factors which contribute to student retention in alternative education.
International Journal of Inclusive Education, 20(5), 536-551.
Reynolds, C. R., & Kamphaus, R. W. (2015). Behavior assessment system for children (3rd ed.).
Minneapolis, MN: Pearson.
Rogers, T., Duncan, T., Wolford, T., Ternovski, J., Subramany, S., & Reitano, A. (2017). A
randomized experiment using absenteeism information to “nudge” attendance. National
Center for Education Evaluation and Regional Assistance.
Rosen, J., Porter, S., & Rogers, J. (2017). Understanding student self-reports of academic
performance and course-taking behavior. AERA Open, 3(2), 1-14.
Rumberger, R., & Gottfried, M. (2016). Not All School Attendance Data are Created Equal.
Education Week, 35(34), 27.
Sanchez, E., & Buddin, R. (2016). How accurate are self-reported high school courses, course
grades, and grade point averages? [PDF Document]. Retrieved from
http://www.act.org/content/dam/act/unsecured/documents/5269-research-report-how-acc
urate-are-self-reported-hs-courses.pdf
Sticca, F., Goetz, T., Bieg, M., Hall, N., Eberle, F., & Haag, L. (2017). Examining the accuracy
of students’ self-reported academic grades from a correlational and a discrepancy
perspective: Evidence from a longitudinal study. PLoS ONE 12(11). Retrieved from
http://doi.org/10.1371/journal.pone.0187367
Sun, R. & Shek, D. (2012). Student classroom misbehavior: An exploratory study based on
teachers’ perceptions. The Scientific World Journal, 1-8.
Tingle, L., Schoeneberger, J., & Algozzine, B. (2012). Does grade retention make a difference?
The Clearing House: A Journal of Educational Strategies, Issues and Ideas, 85(5),
179-185.
Warren, J., Hoffman, E., & Andrews, M. (2014). Patterns and trends in grade retention rates in
the United States, 1995-2010. Educational Researcher, 43(9), 433-443.
Yang, M., Chen, Z., Rhodes, J., & Orooji, M. (2018). A longitudinal study on risk factors of
grade retention among elementary school students using a multilevel analysis: Focusing
on material hardship and lack of school engagement. Children and Youth Services
Review, 88, 25-32.
Zimmerman, M., Caldwell, C., & Bernat, D. (2002). Discrepancy between self-report and
school-record grade point average: Correlates with psychosocial outcomes among
African American adolescents. Journal of Applied Social Psychology, 32(1), 86-109.
6 Ways to Collect Data on Your Students’ Behavior. (2018). Retrieved May 27, 2019, from
blog.brookespublishing.com/6-ways-to-collect-data-on-your-students-behavior/