
The Status of Arts Assessment in the United States

Abstract and Keywords

Based on data gathered from members of SEADAE, the State Education Agency Directors of Arts Education, the authors report on current priorities and practices in dance, media arts, music, theatre, and visual arts assessment in states across the nation and in Department of Defense schools around the world. With the 2014 publication of the National Core Arts Standards and the then-impending replacement of No Child Left Behind (NCLB) by the Every Student Succeeds Act of 2015 (ESSA), it became clear that conditions and resources at national and state levels had undergone significant change since the completion of the 2008 SEADAE study of arts assessment practices. New questions relative to current policy and practice needed to be addressed in order to inform the approach to and development of state and local assessment in the arts, the outcomes of which must inform and raise the quality of instruction in today’s arts classrooms.

Keywords: arts assessment, SEADAE, NCLB, ESSA, drivers of assessment, NEA, National Endowment for the Arts, learning standards, arts standards

Marcia McCaffrey and Linda Tracy Lovins

Introduction and Parameters

This study was undertaken by SEADAE (State Education Agency Directors of Arts Education, 2012, 2016), the professional organization for directors of arts education (DAEs) at state departments of education (SEAs) in the United States, whose responsibility it is to oversee and manage state-level work related to arts education. Incorporated in 2005 and granted 501(c)(3) status in 2006, the organization maintains a nationwide infrastructure of DAEs in state departments of education. SEADAE membership is necessarily fluid; at the time of this writing, it consisted of 45 states and the Department of Defense Education Activity (DoDEA), which manages schools around the world to serve the children of military service members and Department of Defense civilian employees.

Comprising education specialists in dance, media arts, music, theatre, and the visual arts, SEADAE members typically have educational expertise in one or more disciplines. However, DAEs' state responsibilities necessitate egalitarian attention to all arts disciplines. Consequently, this study on arts assessment is distinctive within this publication: rather than addressing music alone, it encompasses all arts disciplines.

The 2008 Study

SEADAE was awarded a grant by the US National Endowment for the Arts (NEA) in fiscal year 2008. Its focus was on "gathering data relevant to arts assessment and associated professional development (PD) in member states across the nation" (Lovins, 2010, p. 23). Although some SEAs lacked an arts education specialist at the time of data collection, 37 states were represented in this project, working collaboratively within regions under the leadership of their respective regional representatives.

Research Questions

To provide alignment across the regions, five research questions were designed to guide SEADAE’s work in 2008 (for the full report, see Lovins, 2010). Professional development in arts assessment for DAEs and arts educators was of great concern, as was the need for documenting the status quo of arts assessment and related training across the states. The research questions guiding the 2008 study were:

  1. What are the arts assessment professional development needs and priorities of state education arts directors and their partners?

  2. What are the promising practices, resources, and current tools for arts assessment being used in the states?

  3. How have arts education directors and their partners been engaged in professional development related to assessment in the arts?

  4. Did the regions provide one or more professional development opportunities that addressed the region's needs and priorities?

  5. Was the level of regional knowledge and repertoire of arts assessment strategies among arts education directors at state education agencies, their partners, and arts, education, and teacher leaders improved?

The promising practices, resources, and extant tools for arts assessment in the states ranged from hard-copy documentation to information provided on SEADAE’s and other websites. In addition, there were examples of assessment programs, publications, individuals, and organizations available for assistance.

Several states were involved in sophisticated assessment activities, while others were less so at the state level. In no way did the data suggest that some states were ahead of or behind their counterparts across the nation; philosophy, policy, economics, and simple geography each played a major role in determining the degree to which states were, and chose to be, involved with arts assessment.

Several states’ DAEs had participated in SCASS Arts, one of the State Collaborative on Assessment and Student Standards (SCASS) projects of the Council of Chief State School Officers (CCSSO). Membership in SCASS Arts represented an investment by SEAs, so it was expected that member agencies would have the necessary support for provision of assessment tools, strategies, and professional development for their constituents.

State policy and adoption of new or revised arts standards for learning often implied or included direct mandates for assessment. These were generally expected to be local, classroom assessments that addressed the degree to which students met a given state’s learning standards in an arts discipline; if a state mandated arts assessment, however, the means of implementing the assessment was left to schools and districts. Policy and/or state law also occasionally required schools and districts to report assessment (p. 59) results for their annual Report Cards, but the data sought was confined to “high-stakes” content areas, to the exclusion of the arts, world languages, physical education, health, and others.

Most SEAs worked extensively to provide assessment models and tools; the degree to which DAEs could do so in the arts varied from state to state, based on policy and mandates. The data indicated there was a clear need to continue partnership development with postsecondary institutions for both in-service and preservice teachers; more than one region noted a dearth of graduate and undergraduate courses on arts assessment, either discipline-specific or integrated.

High-quality assessment to inform both classroom practice and educational policy was deemed critical. As schools, districts, and states arrived at an intersection forged by (1) the accountability requirements of the Elementary and Secondary Education Act (ESEA), known at its 2001 signing into law as No Child Left Behind (NCLB); (2) varying philosophies of arts education and assessment in the arts; and (3) an increasingly challenging economy, it was suggested that state leadership would be critical to successfully positioning arts assessment at a higher level in the overall educational scheme.

SEADAE: The Organization Behind the Research

To understand the complexity of the work the 2015–2016 study endeavors to describe, it is important to understand the organization behind the research, particularly the parameters within which its members work. SEADAE’s mission is to “support the professional effectiveness of individual members and provide a collective voice for leadership on issues affecting arts education” (SEADAE, 2012). The organization’s vision is for an educational system in which all PreK–20 students gain artistic literacy through equitable access to quality, sequential, standards-based learning opportunities in dance, media arts, music, theatre, and the visual arts. SEADAE’s work is supported by individual member contributions, project-specific agency contributions, and grants from the NEA, Hewlett Foundation, National Association of Music Merchants, Educational Theatre Association, and National Association for Music Education.

Collectively, SEADAE relies on members’ expertise specific to the arts and other areas of education, such as federal ESEA Title programs or assessment literacy, test development, and item analysis. SEADAE further strengthens members’ and its own organizational capacity by seeking expertise from its members and national partners, which serves the organization, its partners, and arts education well.

Because of its state-based roots and collective national footprint, SEADAE’s members and elected leadership concurrently interact with key stakeholders in the arts education community at local, state, and national levels. Uniquely positioned at the nexus between local and national perspectives, SEADAE members have developed a comprehensive overview and understanding of the American arts education ecosystem, enabling SEADAE to provide leadership at local, state, and national levels and provide a critically important voice for empowering change in arts education.

(p. 60) SEADAE strives to leverage the collective knowledge and experience of its membership to build capacity across the organization, creating a community of experts with common beliefs about the provision of a quality arts education through our nation’s public schools. Primary among these beliefs is that all arts disciplines are of equal value: dance, media arts, music, theatre, and the visual arts are the disciplines that collectively constitute “the arts,” and the unique capacities of each discipline resonate in important, yet often different ways with individual students. Therefore, each arts discipline is recognized as part of a well-rounded education for all students. While not every state has policies that reflect this parity among arts disciplines, SEADAE members endeavor to represent each arts discipline on equal terms, regardless of personal expertise or confines of state policy. This absence of bias is considered a substantive asset by the organization and its partners.

In the United States, each state has an obligation to fulfill federal legislative mandates signed into law through the ESEA, known at its December 2015 signing as the Every Student Succeeds Act, or ESSA (National Association for Music Education, 2016). States are further obligated by legislative mandates established through the laws and regulations within their own boundaries. These state and federal requirements, working in tandem, drive policy decisions at the state level. Therefore, SEADAE members' work is at once similar and different: similar because of in-common federal education legislation, and different because of the variances among states' regulations. The 10th Amendment to the US Constitution, part of the Bill of Rights, limits the authority of the federal government: "The powers not delegated to the United States by the Constitution, nor prohibited by it to the States, are reserved to the States respectively, or to the people" (National Archives, n.d., Article the 12th). Because the Constitution omits education as a federal power, the states hold primary authority and responsibility for its provision.

Due to these two policy drivers, SEADAE members work in federal and state spheres to serve educators and their students at the local level, and the portions of their work designated as state- or federally based vary from person to person. Regardless of the variances created by states operating under their own laws, much commonality exists among members' SEA assignments, making clear the value of members' propensity for sharing resources, tools, and approaches to work, including analyses and interpretations of federal and state policies, practices, resources, and initiatives.

Directors of arts education commonly make policy recommendations that form the basis for what public schools are expected to provide for students regarding opportunities to learn in arts education; support local districts and educators as they strive to meet state and federal regulations and mandates; and review and analyze data from a variety of sources to identify and track changes in the status and condition of arts education in their state. They also inform constituents of model practices and trends that define current or foreshadow future educational practices in the arts; manage the state adoption process for student achievement standards in the arts; and develop and implement quality assessment tools and processes associated with state achievement standards in the arts.

(p. 61) One of SEADAE’s most significant assets is its ability to collect information from members in efficient and timely ways. Members value the views gathered from within the organization and prioritize responding to the many requests for information. The data for the 2015–2016 study on the status of arts assessment across the United States was collected by surveying all DAEs, sorted into categorical trends, and reported out similarly.

Development of the 2014 National Core Arts Standards

SEADAE played a major role in the initiation and development of the 2014 National Core Arts Standards (National Coalition for Core Arts Standards, 2014). In 2010, SEADAE convened more than 50 national partners to pursue reimagined national achievement standards in arts education (National Coalition for Core Arts Standards, 2016). Two main reasons drove this action: the first was political; the second had to do with "shelf-life."

The 2008 election of US President Obama ushered in new federal policies and philosophical positions that drove decision-making at the state level. One such initiative, the federal Race to the Top (RttT) grant, was applied for by many states and seen as a way to bring much-needed resources to states and schools whose funding was greatly challenged by the financial crisis of 2008. Race to the Top was “designed to spur systemic reform and embrace innovative approaches to teaching and learning in America’s schools. Backed by an historic $4.35 billion investment, the reforms contained in the Race to the Top grant will help prepare America’s students to graduate ready for college and career, and enable them to out-compete any worker, anywhere in the world” (US Department of Education, 2009).

The grant required states to adopt rigorous standards for student achievement. Most states’ grant applications confirmed their commitment to the newly minted Common Core State Standards in English language arts and mathematics. SEADAE members believed that, in the shifting sands of this educational landscape, arts educational opportunities for students would be significantly overshadowed if updated and more rigorous standards in the arts were not pursued. SEADAE saw this as a time to rally partners toward creating a new set of voluntary national arts standards, and pressed for action.

In the United States, student academic standards have typically been developed by a group of like-minded individuals and organizations at the national (not federal) level, vetted and adopted at the state level, and implemented at the local level. The first voluntary national arts standards in the United States, the National Standards for Arts Education, were developed by a national consortium of arts leaders in 1994 (Consortium of National Arts Education Associations, 1994). In less than a decade, 49 of 50 states developed and adopted their own state arts standards. Some states’ standards were essentially the same as the 1994 national model; others made significant adaptations; and still others created arts standards that were quite different from the 1994 National Standards for Arts Education.

Because education is state-centric, state policy has generally emphasized state-generated standards. Some states also mandate revisions on a time-based cycle, while others have no such mandates. Over time, early adopters of the 1994 standards updated their state documents once or, sometimes, twice. Some states gravitated toward the artistic-process model first released in the 1994 assessment framework for the 1997 National Assessment of Educational Progress in the Arts (Arts NAEP). Conversely, other states made no changes to their standards documents at all. As time went on, states' standards became increasingly disparate and individualized in their articulation of student learning in the arts. This, in turn, led to deep variances in the goals and objectives of standards-based arts education in schools and classrooms across the nation.

While every state has the right to develop its own standards, there are benefits to establishing a degree of alignment among states by building state standards from a single, national model. A distinct benefit of aligned standards for educators is the resulting increase in well-vetted, research-based resources designed under the aegis of national arts education organizations and other respected partners. Alignment to national standards also increases the availability of high-quality instructional and assessment models, providing significant savings in time and money for states, districts, and schools. Alignment benefits students as well, who often experience educational disjointedness in the current mobile society, particularly when they converge in higher education arts programs.

As the 1994 National Standards for Arts Education reached their 15th anniversary in 2009, and as new initiatives emerged from the US Department of Education, SEADAE members became more outspoken about what they saw as a critical need in arts education: new national standards. Once other stakeholders in arts education agreed that reimagining national arts standards was a priority, a structure and timeline for developing new standards emerged and a leading group of change-agents coalesced to form the National Coalition for Core Arts Standards (NCCAS). Managed and facilitated by SEADAE, a Leadership Team was convened to create the standards framework and guidelines, provide research, and seek grant funding. The professional arts education organizations supported discipline-specific teams to lead and write the standards based on the framework, guidelines, and research provided by the Leadership Team (College Board, 2011a, 2011b, 2011c, 2012a, 2012b, 2012c, 2014).

At the time of this 2015–2016 study, 14 states had already adopted new state standards informed by the 2014 National Core Arts Standards, with several more states in the process of updating theirs using the new standards as the foremost model. In a standards-based educational system, measuring student learning against those standards is key; standards and assessment go hand-in-hand.

Model Cornerstone Assessments

With the 2014 release of the National Core Arts Standards, the NCCAS Leadership Team made a point of simultaneously releasing Model Cornerstone Assessments (MCAs) for each arts discipline at selected grade levels. By doing so, they highlighted the standards’ measurability, promoted performance-based tasks as a leading form of best-practice assessment, and underscored the integral relationship of standards and assessments.

In 2015 and 2016, SEADAE received two NEA grants on behalf of NCCAS to support a national pilot and benchmarking process for MCAs; the first was awarded for elementary MCAs, and the second for high school MCAs. Based on findings from the (p. 63) pilots, the professional arts education organizations will revise and refine the MCAs to provide research-based models for assessing student learning against the National Core Arts Standards.

SEADAE has pursued additional investments in arts assessment. Through a project grant from the NEA, SEADAE and Young Audiences partnered for a National Arts Assessment Institute in Chevy Chase, Maryland (July 2014). The 75 attendees immersed themselves in “unpacking” the MCAs of the then-soon-to-be-released National Core Arts Standards and learning about arts assessment across the nation, including work from Colorado, Connecticut, Delaware, Florida, Michigan, Pennsylvania, South Carolina, and Tennessee.

The 2015–2016 Study

With the 2014 publication of the National Core Arts Standards and the impending replacement of NCLB by ESSA in 2015, it became clear that conditions and resources at national and state levels had undergone significant change since the completion of the 2008 SEADAE study of arts assessment practices. Rather than replicate the 2008 study, new questions needed to be designed relative to current policy and practice.

Research Questions: 2015–2016 Study

The questions around which the 2015–2016 study was designed probed for greater depth and detail than previously sought in 2008, endeavoring to provide information about what is being done in arts assessment, who is doing it, how, and why. The questions on which the subsequent survey was based were as follows:

  1. What drives arts assessment?

  2. What priorities are associated with arts assessment, and how are states addressing them?

  3. Who is engaged in arts assessment and in what capacity?

  4. What (professional development) needs exist and how are they being addressed?

  5. How is technology impacting arts assessment?

  6. How are arts assessment results used?

Separation of State and Local Control

As the study progressed, the need to ascertain which states considered themselves “local control” states arose. Emailed responses from SEADAE members within a few hours of posing the question revealed that all states considered themselves “local control” states in regard to educational practice.

(p. 64) High-stakes decisions in education are generally made through state legislatures and/or state boards of education. “Local-control” states relinquish control over a broad range of related decisions to district boards of education and school administrators. The following provides a clear explanation of the separation between local and state activity:

In education, local control refers to (1) the governing and management of public schools by elected or appointed representatives serving on governing bodies, such as school boards or school committees, that are located in the communities served by the schools, and (2) the degree to which local leaders, institutions, and governing bodies can make independent or autonomous decisions about the governance and operation of public schools

(Great Schools Partnership, 2016).

While an arts education may be required by state law, local boards and administrators determine how the law is addressed, including what will be offered and the degree of funding with which it will be supported.

For purposes of this study, a number of assessments were excluded. Although addressed as a facet of arts assessment activity, program evaluation was not investigated as a separate entity, nor was any activity associated with the Arts NAEP or the National Board for Professional Teaching Standards (NBPTS).

Method of Data Collection

SEADAE members are located across the nation and work with their peers primarily through virtual meetings. Due to time constraints, limited funding, and each SEA’s priorities, face-to-face meetings of SEADAE members are rare, even within regions. Therefore, a digital survey was designed within the framework of the research questions.

Refining and Preparing the Survey Items

Once potential survey items were written, feedback was sought from several DAEs in states of varying sizes across the nation. All DAEs selected for this purpose had considerable experience in their state roles and in research. Once the survey questions were refined, based on their input, an overview of the anticipated survey was made available to all DAEs via a virtual meeting, and a recording was digitally archived for on-demand member use.

The survey questions were refined once again, and the resulting online survey was launched. Follow-up to nonrespondents and to respondents whose surveys were incomplete was provided via e-mail and phone, and the window of time for completing the survey was extended to accommodate DAEs’ work flow to every extent possible.

Other Considerations

All items and variables within the items required a response in order to proceed, with the exception of an optional "Other (please specify)" at the end of each item. This allowed respondents to add or expand on information not covered by the variables.

(p. 65) Because assessment language varies widely across states, a glossary was provided to help align responses to variables within the survey. The glossary and a sample of the survey were posted on SEADAE’s wiki for respondents’ in-process use.

Several respondents to this survey are or have been engaged in arts assessment initiatives at multistate or multinational levels, knowledge of which could have influenced their responses. To focus respondents’ feedback solely on arts assessment activity within the confines of their immediate areas of responsibility, each survey item included “in your state.”

While some knowledge of school and district arts assessment activity and resources is available to DAEs, such information is often anecdotal in nature. Because DAEs work in “local-control” states, it is difficult to provide detailed data regarding specific activities occurring solely at school and district levels. For purposes of this survey, respondents were asked to offer their “expert opinion and professional judgment … to provide the most accurate, representative responses possible” and were respectfully accorded the option of selecting Unknown in response to several variables.

Three preliminary questions were devised to provide context for data analysis. These questions identified respondents’ geographical area of responsibility and SEADAE region, and asked whether respondents’ states had been awarded and made use of RttT funding for “hard-to-assess content areas.” To allow SEADAE members the ability to respond freely, more specific identifiers were avoided and an informal nondisclosure policy was adopted as a premise of seeking specific, sometimes politically sensitive information (see Figure 4.1).


Figure 4.1 The survey items, in research-question context.

Data Analysis

The data were analyzed holistically, then examined in subsets where such comparisons proved illuminating. The subsets most often used for comparison were based on “yes/no” responses related to the use of RttT funding: “Please indicate whether your state has received and made use of federal RttT funding for development of assessments in the arts.”

Both whole numbers and percentages were examined for clarity in presentation of data. It was determined that reporting data in percentages provided a clearer, more consistent picture of the results.

Findings

The survey was distributed to current DAEs or an appointed proxy; therefore, the number of responses is significantly smaller than would be expected from a survey of school, district, and state arts education stakeholders. Out of a total population of 46 member agencies/states, 42 responded and 40 completed the survey. Respondents included 38 SEA employees, the DAE from DoDEA, two SEA-appointed state arts council representatives to SEADAE, and a state arts council representative known to have extensive knowledge of a state for which data could not otherwise have been collected. This represented a response rate of 91% and a completion rate of 87%. Additionally, 6 states responded Yes to the RttT question (14%); the remaining 36 states (86%) were non-RttT states, 34 of which completed the survey. The data provided via the incomplete responses were included in the analysis, and all data were reported in percentages to accommodate the differing numbers of respondents to later items.
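The reported rates follow directly from these counts. As a point of reference only, the arithmetic is sketched below in Python; this is an illustrative calculation under the counts stated above, not the authors' analysis code, and the variable names are ours.

```python
# Illustrative sketch (not the authors' analysis code): the reported rates
# follow from the counts given above -- 46 member agencies/states surveyed,
# 42 responses, 40 completions, and 6 respondents answering Yes to RttT.

def pct(part: int, whole: int) -> int:
    """Percentage rounded to the nearest whole number, as reported in the text."""
    return round(100 * part / whole)

population = 46    # SEADAE member agencies/states invited to respond
responses = 42     # surveys started
completions = 40   # surveys completed
rtt_yes = 6        # respondents reporting use of RttT funds for arts assessment

print(f"Response rate:   {pct(responses, population)}%")            # 91%
print(f"Completion rate: {pct(completions, population)}%")          # 87%
print(f"RttT states:     {pct(rtt_yes, responses)}%")               # 14%
print(f"Non-RttT states: {pct(responses - rtt_yes, responses)}%")   # 86%
```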

Question 1. What drives arts assessment?

Item 1a-  In your state, what drives arts assessment activity at the local level?

Item 1b-  In your state, what drives arts assessment activity at the state level?

In Item 1a, respondents were asked to indicate the strength of nine potential drivers on a basis of Always, Often, Sometimes, Rarely, and Never, as shown in Table 4.1. They could also choose Other and provide details of additional drivers in an associated text box. (p. 67) No definition of “activity” was provided, allowing respondents to consider classroom assessments, planning, professional development, and all other related work as activity.

Table 4.1 Nine Potential Drivers of Arts Assessment at the Local Level and the State Level

Item 1a—In your state, what drives arts assessment activity at the local level? Is it…

Item 1b—In your state, what drives arts assessment activity at the state level? Is it…

Response options for each driver: Always, Often, Sometimes, Rarely, Never, Unknown

a. Policy—Local?
b. Policy—State?
c. State initiative(s)?
d. Federal initiatives (e.g., waivers, Race to the Top)?
e. Desire for parity among all content areas?
f. Research-based instructional practice?
g. Educator evaluation/Teacher accountability for student learning?
h. School/District accountability for student learning?
i. Student accountability for learning?
j. Other (please specify)?

1a+/1b+ Please describe "Other" and/or elaborate on one or more responses in Question 1a above

As shown in Graph 4.1, 54.8% of all respondents indicated that educator evaluation/teacher accountability for student learning always or often drove arts assessment activity at the local level. Local policy (as determined by school administrators or district boards of education) was a close second (47.7%), followed by state policy (38.1%) and school/district accountability for student learning (35.7%). Student accountability for learning (30.9%) and state initiatives (28.6%) ranked next. Slightly more than one in five respondents indicated significant driver strength in research-based instructional practice or federal initiatives, such as waivers or RttT. A desire for parity among all content areas (11.9%) ranked last among all listed drivers of local arts assessment activity.


Graph 4.1 Responses to Item 1a, drivers of arts assessment at the local level, in order of selection frequency based on total of “Always” and “Often”

When examined in order of total frequency for Always and Often (see Graph 4.2), it became clear that state-level arts assessment activity was driven most often by state initiatives (52.4%) and state policy (50.0%). These two drivers were followed closely by educator evaluation/teacher accountability for student learning (47.6%). While these three drivers were each identified as Always or Often by roughly half of all respondents, the remaining drivers dropped to a rate of one in three or fewer: federal initiatives (33.3%), research-based instructional practice (31.0%), school/district accountability for student learning (31.0%), and student accountability for learning (29.3%). The remaining two drivers were selected as Always or Often by approximately 1 in 5 and 1 in 10 respondents, respectively: desire for parity among all content areas (21.5%) and local policy (9.6%).


Graph 4.2 Responses to Item 1b, drivers of arts assessment at the state level, in order of selection frequency based on total of “Always” and “Often”

When viewed in terms of whether a state received and used RttT funding, the results were significantly different for Items 1a and 1b. As shown in Graph 4.3, local arts assessment activity in RttT states was driven, first and foremost, by educator evaluation/teacher accountability for learning, reported by 100% of all RttT respondents. The second-highest driver was the federal initiative itself (83.3%). In contrast, fewer than half of all non-RttT respondents selected educator evaluation/teacher accountability for learning as Always or Often (47.2%), matched closely by local policy (44.5%). In further contrast to those funded by assessment-based RttT monies, the third-highest reported driver of local activity in non-RttT states was state policy (36.1%), which ranked sixth of nine for RttT states.


Graph 4.3 Comparison of responses to Item 1a, drivers of arts assessment at the local level, in order of selection frequency based on total of “Always” and “Often” in the context of RttT funding for “hard-to-assess” content areas

As shown in Graph 4.4, high positives for drivers of state-level arts assessment activity were reported by respondents from RttT states, rating both federal initiatives and educator evaluation/teacher accountability for learning at 100%. In these states, state initiatives also ranked high (83.3%) for Always and Often. This was in contrast to responses from non-RttT states, which ranked both state initiatives and state policy highest (47.2%), but at significantly lower levels, followed by educator evaluation/teacher accountability for learning (38.9%).


Graph 4.4 Comparison of responses to Item 1b, drivers of arts assessment at the state level, in order of selection frequency based on total of “Always” and “Often” in the context of RttT funding for “hard-to-assess” content areas
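To make the ranking method concrete, the sketch below shows one way a combined "Always"/"Often" rate per driver could be tallied and ordered, both for all respondents and for an RttT subset, mirroring the comparisons in Graphs 4.1 through 4.4. The records and driver labels are hypothetical placeholders, not SEADAE survey data, and the code is an illustration rather than the study's actual analysis.

```python
# Hypothetical illustration of an "Always" + "Often" tally of the kind behind
# Graphs 4.1-4.4; the rows below are invented placeholders, not survey data.
from collections import Counter

rows = [
    # (driver, response, respondent_is_rtt_state)
    ("State policy", "Often", True),
    ("State policy", "Sometimes", False),
    ("State initiative(s)", "Always", True),
    ("Educator evaluation", "Often", False),
    ("Educator evaluation", "Always", True),
    ("Local policy", "Rarely", False),
]

def always_often_ranking(records):
    """Percent of responses per driver rated 'Always' or 'Often', highest first."""
    totals, top = Counter(), Counter()
    for driver, response, _ in records:
        totals[driver] += 1
        if response in ("Always", "Often"):
            top[driver] += 1
    rates = {d: round(100 * top[d] / totals[d], 1) for d in totals}
    return sorted(rates.items(), key=lambda item: item[1], reverse=True)

print(always_often_ranking(rows))                       # all respondents
print(always_often_ranking([r for r in rows if r[2]]))  # RttT subset only
```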

Question 2. What priorities are associated with arts assessment, and how are states addressing them?

Item 2- In your state, to what degree are the following priorities associated with arts assessment, and how is your state addressing each?

Item 3- In your state, what type(s) of assessment best address the outcomes associated with your state’s arts standards?

Two survey items were designed to gather information about state priorities associated with arts assessment. The nine variables in Item 2 first required respondents to select (p. 70) High, Moderate, Low, or N/A to indicate the degree to which each potential priority was associated with arts assessment in the state. They were then asked to indicate, in a text box per variable, how their state was addressing each. The variables to which they were asked to respond were: teacher accountability; school/district accountability; program accountability/sustainability; statewide standardized assessment; district-wide common assessment; academic parity (e.g., among arts disciplines, among all content areas); advocacy (justifying relevance of the arts in today’s educational landscape); funding (e.g., availability, distribution); research or pilot program(s); and other (please specify). At the end of the variables, respondents were given an opportunity to describe “other,” if selected, and/or elaborate on one or more responses.

Similar to the results of Item 1b, educator evaluation/teacher accountability for learning (51.2%) was identified as a high priority most often (see Graph 4.5). After that came a significant drop: the next-highest priorities, school/district accountability for student learning (24.4%) and advocacy/justifying the relevance of the arts in today's educational landscape (22.0%), were identified as High by one in four respondents or fewer.


Graph 4.5 Responses to Item 2, indicating the degree to which these priorities are strongly associated with arts assessment, in order of frequency

The nine variables in Item 2 required respondents to describe how their states were addressing each priority. Teacher accountability engendered 28 responses, ranging from “hot topic” to generic systems set up by the state for local use in all content areas, requiring adaptation to arts instruction. School/district accountability responses covered a variety of topics, including a focus on Common Core State Standards (rather than the arts), school improvement reporting, program reviews, and more. One state was described as (p. 71) having a “monitoring and evaluation system for public school districts. The system shifts the monitoring and evaluation focus from compliance to assistance, capacity-building, and improvement.”

Program accountability was largely identified as a matter of local control; in one case, student enrollment formed the basis for accountability. A few states require school and district reporting in the arts as part of their annual “report cards,” often to show compliance with state law/policy, unrelated to courses offered or student growth. In response to statewide standardized assessment, two respondents noted stakeholder discussions in progress for such assessments in the arts, and a third noted a non-SEA initiative for common arts assessments. The topic of district-wide common assessments prompted comments on several district initiatives, most often centered on end-of-course assessments for secondary students. Again, policy, design, and monitoring were cited as matters of local decision.

Academic parity among arts disciplines and among all content areas drew comments about local control and a statewide focus on Common Core State Standards and STEM (science, technology, engineering, and math). Content-area parity was most often described as supported through other, nonassessment contexts. One respondent described advocacy as follows: “Advocacy initiatives seem directed more around the benefits of arts education and the arts as part of a well-rounded education rather than using arts assessment results to justify arts in schools.” Other respondents, however, noted some use of arts assessment as a rallying point by statewide organizations.

A few respondents noted that funding was tied to specific initiatives or sources, such as adoption of the National Core Arts Standards or grant funding. By and large, however, funding was identified as a matter of local decision-making with no ties to arts assessment as a driver. One RttT respondent cited a state-provided platform and test-item bank for districts to use at no cost, but noted that not all districts take advantage of it.

(p. 72) Respondents cited several state and district-level research or pilot program(s). Links to specific programs and resources were made available by respondents in the optional “links to resources” item provided at the close of the survey.

In addition to the nine priorities, which were selected on the basis of researcher experience and anecdotal evidence, respondents could add priorities specific to their states and elaborate on them in Item 2+. This generated a significant amount of text and important anecdotal data, two-thirds of which offered descriptions of initiatives, projects, and more. One respondent cited a pilot project based on another state’s model, while a second respondent mentioned a new governor’s STEM task force as a potential opportunity for STEAM funding (i.e., STEM plus Arts). A teacher evaluation system was described as the “biggest driver of arts assessment” by one respondent, while another reported adoption of the National Core Arts Standards as a current, multiyear driver of activity inclusive of arts assessment. Yet another respondent wrote, “arts and arts assessment are being increasingly used as indicators of the quality of schools and arts programs, and as an indicator of an interest in whole child education.”

Under the presumption that development of state standards in the arts implies a level of priority, respondents were asked in Item 3 to identify the type(s) of assessment that “best address the outcomes associated with your state’s arts standards,” indicating the strength of benefit on a scale of 0–4, with 4 being the greatest. Data was analyzed based on assessment types rated as 4 or 3. The variables to which they responded were: assessment FOR learning; assessment OF learning; embedded assessment; performance assessment; portfolio; constructed response; selected response; common assessment; classroom assessment; competency-based assessment; authentic assessment; and other (please specify). At the end of the variables, respondents were given an opportunity to describe “other,” if selected, and/or elaborate on one or more responses.

As one might expect of the arts, performance assessment (73.25%), assessment OF learning (65.8%), and portfolio (65.8%) ranked very high as the best means of measuring student learning outcomes (see Graph 4.6). These were followed closely by assessment FOR learning, authentic assessment, and classroom assessment, each between 63% and 63.5%. Selected response (e.g., multiple choice, true-false) and constructed response (e.g., short answers, essays) ranked very low at 29.3% and 17.1%, respectively. Of note was the low ranking of common assessment (29.3%), indicating that fewer than one in three respondents reported having knowledge of such assessments shared between schools or across all schools within a district. In addition to comments regarding local control, one respondent noted a "hope that districts choose the best mix of formative/summative assessment for the various arts disciplines, and then consistently apply that across district schools."


Graph 4.6 Responses to Item 3, showing best types of assessment to address state standards outcomes, in order of strength based on a total of “4” and “3” (on a scale of 0–4, with 4 being highest)

Question 3. Who is engaged in arts assessment and in what capacity?

Item 4a- In your state, who is engaged in arts assessment decisions, development, and/or initiatives?

Item 4b- In your state, indicate in what capacity the following are engaged in LOCAL (district, school, or classroom) arts assessment decisions, development, and/or initiatives.


Item 4c- In your state, indicate in what capacity the following are engaged in STATEWIDE arts assessment decisions, development, and/or initiatives.

To learn about stakeholder participation, the survey first posed a general question (Item 4a) about which potential stakeholders were engaged in arts assessment decisions, development, and/or initiatives. Using the same variables as in Items 4b and 4c, we asked DAEs to specify one or more options, as appropriate, among Local Assessments, State Assessments, N/A, and Unknown, as applied to each variable. This covered a variety of known participants, some of whom the data showed were involved at statistically insignificant levels.

As indicated in Graph 4.7, arts teachers (90.2%), district administrators/arts supervisors (75.6%), and school administrators (63.4%) ranked highest for involvement with local decisions, development, and/or initiatives in arts assessment. Statewide arts education associations (45.0%) ranked fourth in this arena, well above other statewide groups. When state activity was considered, 6 of 10 stakeholder groups emerged most frequently: the state education agency (37.5%), arts teachers (36.6%), statewide professional arts education associations (35.0%), higher education (35.0%), district administrators/arts supervisors (31.7%), and the state’s arts consortium/coalition (30.0%).


Graph 4.7 Responses to Item 4a, indicating the degree to which various stakeholders participate in local and state arts assessment decisions, development, and/or initiatives

Items 4b and 4c requested the details of that involvement, with eight variables offered for each stakeholder group (see Table 4.2). Because the reported levels of involvement indicated in Item 4a were statistically insignificant for several groups, Items 4b and 4c were analyzed for the six highest stakeholder groups: arts teachers, school administrators, district administrators/arts supervisors, statewide professional arts associations, the state education agency, and higher education. Non-arts teachers, professional test-development companies, state arts education alliances, and arts consortia/coalitions were omitted.

Table 4.2 Eight Variables Related to Stakeholder Involvement in Arts Assessment Activities

Item 4b—In your state, indicate in what capacity the following are engaged in LOCAL (district, school, or classroom) arts assessment decisions, development, and/or initiatives.

Item 4c—In your state, indicate in what capacity the following are engaged in STATEWIDE arts assessment decisions, development, and/or initiatives.

Capacities listed for each stakeholder group: Professional development delivery; Assessment design (e.g., types, timelines, frameworks); Item development; Piloting, field-testing; Assessment administration (e.g., "giving the test," proctoring, security); Data analysis; Reporting results; Fiscal agent, funder; Unknown

a. Arts teachers
b. Nonarts teachers
c. School administrators
d. District administrators/arts supervisors
e. Professional test-development company
f. State arts education alliance (e.g., Kennedy Center Alliance)
g. Statewide professional arts education association(s)
h. Office(s)/bureau(s) within the SEA
i. Higher education
j. Arts consortium/coalition
k. Other (please specify)

4a+/4b+ Please describe "Other" and/or elaborate on one or more responses in Question 4a above

Arts teachers and district administrators/arts supervisors were reported as significantly involved in most areas of local arts assessment activity (see Graph 4.8), with arts teachers involved to a greater degree than district administrators/arts supervisors in assessment design (72.5%), professional development delivery (70.0%), test-item development (70.0%), piloting/field-testing (57.5%), and assessment administration (57.5%). In the areas of reporting results (55.0%) and data analysis (52.5%), however, district administrators/arts supervisors were reported as involved more often than arts teachers. One in 10 respondents also reported district administrators/arts supervisors functioning as fiscal agents/funders for local arts assessment activity. School administrators were reported by significantly fewer states as involved in most areas of local arts assessment.


Graph 4.8 Responses to Item 4b, indicating the capacities in which school and district personnel are involved in local arts assessment decisions, development, and/or initiatives

While school and district personnel's involvement in local arts assessment activity was reported as significant in several areas, nondistrict agencies and organizations were reported as less involved in all areas (see Graph 4.9). Professional development delivery was reported as the most significant contribution of these groups: statewide professional arts organization(s) (42.5%), state education agency (40.0%), and higher education (22.5%). At least one in five respondents also reported involvement of the state education agency in the areas of reporting results (22.5%), assessment design (22.5%), test-item development (20.0%), and data analysis (20.0%). SEAs were alone in being reported as fiscal agents/funders of local arts assessment activity.


Graph 4.9 Responses to Item 4b, indicating the capacities in which agencies and organizations are involved in local arts assessment decisions, development, and/or initiatives

Local school and district personnel, particularly arts teachers and district administrators/arts supervisors (shown in Graph 4.10), were most often included in all aspects of state-level decision-making and assessment development. Arts teachers were most prominently involved in four areas: professional development delivery (50.0%), test-item development (45.0%), assessment design (42.5%), and piloting/field-testing (42.5%). (p. 75) District administrators were most likely to be involved in delivery of professional development (42.5%), reporting results (37.5%), and assessment design (35.0%). Few school and district personnel were identified as fiscal agents or funders; in that area only, school administrators were reported as more likely to act as fiscal agents (5.0%) than the other two stakeholder groups.


Graph 4.10 Responses to Item 4c, indicating the capacities in which school and district personnel are involved in state arts assessment decisions, development, and/or initiatives

(p. 76) As shown in Graph 4.11, statewide professional arts education associations (50.0%) were most likely to provide professional development related to state work in arts assessment, with state education agencies (47.5%) and higher education (30.0%) ranked close behind. In the remaining state-level activities, state education agencies figured most prominently in all areas, and were reported as the sole fiscal agent or funder among all statewide arts stakeholders.


Graph 4.11 Responses to Item 4c, indicating the capacities in which agencies and organizations are involved in state arts assessment decisions, development, and/or initiatives

Question 4. What (professional development) needs exist and how are they being addressed?


Item 5a- In your state, what professional development needs exist for you and/or other trainers in arts assessment?


Item 5b- In your state, how is professional development in (arts) assessment delivered?

Item 5c- In your state, how is professional development in (arts) assessment funded?

Item 6a- Excluding the professional development addressed in Question 5a, what is needed in your state for implementation of arts assessments to support achievement of your state’s arts standards?

Item 6b- In your state, who has reasonable access to the following (arts) assessment resources?

While the initial focus on perceived needs in arts assessment was on professional development, the survey provided an opportunity to examine the “how” of professional development and additional needs, as well. Items 5a, 5b, and 5c examined professional development alone, while Items 6a and 6b examined needs other than professional development. Professional development was examined in terms of needs, delivery, and funding; additional needs were examined in the realm of arts assessment related to achievement of states’ arts standards and access to assessment resources. Because arts education stakeholders often receive assessment training that is not specific to the arts, the survey items were designed to capture all professional development in assessment.

In Item 5a, respondents were asked to identify the professional development needs for themselves and/or other trainers in arts assessment, indicating the strength of need on a scale of 0–4, with 4 being the greatest. The variables to which they were asked to respond were: classroom-level assessments aligned with arts standards; test-item development; types and purposes of assessments; validity and reliability; data analysis; connecting data to instruction; assessment design (e.g., types, timelines, frameworks); copyright requirements in assessment; assessment delivery (e.g., paper-and-pencil, digital); reporting and discussing results; and other (please specify). As with previous items, respondents were given an opportunity to describe “other,” if selected, and/or elaborate on one or more responses.

As shown in Graph 4.12, classroom-level assessments aligned with arts standards (80.0%) and connecting data to instruction (77.5%) were perceived as the highest needs for professional development, with reporting/discussing results (67.5%) and data analysis (65.0%) following closely behind. Validity and reliability, assessment design, and types and purposes of assessments were reported as high needs by 57.5% to 60.0% of all respondents, while professional development related to test-item development, copyright, and assessment delivery were reported at relatively low rates (30.0% to 42.5%).


Graph 4.12 Responses to Item 5a, indicating professional development needs for DAEs and other trainers, in order of strength based on a total of “4” and “3” (on a scale of 0–4, with 4 being highest)

Item 5b asked DAEs to report how professional development in (arts) assessment is delivered to themselves, other trainers, and arts teachers, indicating the frequency of each delivery means on a scale of 0–4, with 4 being the greatest. For purposes of analysis, responses of 4 and 3 were considered, in total.

When asked how professional development was most often delivered to themselves, other trainers, and arts teachers (see Graph 4.13), respondents ranked personal study as the most frequent means of accessing training for all stakeholder groups at rates of 82.5% (DAEs), 70.0% (other trainers), and 80.0% (arts teachers). DAEs also reported their (p. 79) organization, SEADAE (80.0%), as the second most likely source for professional development. In addition to personal study, other trainers were reported as gaining professional development most often from online training (62.5%) and state service (arts education) organizations (47.5%). Following personal study, 70.0% of respondents reported that arts teachers were accessing professional development in (arts) assessment most often from the state service (arts education) organizations, followed by online training (62.5%), higher education (57.5%), and training in their workplace (57.5%).


Graph 4.13 Responses to Item 5b, indicating the means/sources of professional development delivery to DAEs, other trainers, and arts teachers

(p. 80) In Item 5c, respondents were asked to identify funding sources for professional development, choosing all that applied. This item acknowledged that professional development often focuses on “assessment,” rather than on “arts assessment.” Sources included: SEA; grant(s) (philanthropic); federal (US Department of Education [USDOE]) funds; NEA grant(s); state service organizations; local school/district; out-of-pocket; and other (please specify). Respondents were then given an opportunity to describe “other,” if selected, and/or elaborate on one or more responses.

When queried as to funding sources for professional development, respondents selected local schools/districts most frequently (77.5%), as shown in Graph 4.14. Out-of-pocket (i.e., self-funding) was reported by 47.5% of respondents and state service (arts education) organizations by 45.0% of respondents. While philanthropic grants, federal funds (i.e., USDOE, NEA), and state funds (i.e., SEAs) were reported as sources of funding, they were selected with relatively limited frequency when compared to the professional development funding provided by local schools and districts.


Graph 4.14 Responses to Item 5c, indicating funding sources for professional development in (arts) assessment, in order of selection frequency

RttT-funded states (83.3%) aligned with non-RttT states (76.5%), reporting local schools/districts most frequently as professional development funders (see Graph 4.15). With access to significant federal funds, these states (50.0%) appeared to be almost equal to their non-RttT counterparts (44.1%) in reporting state service (arts education) organizations as funders of professional development, yet were less reliant on other sources of funding for professional development.


Graph 4.15 Responses to Item 5c, indicating funding sources for professional development in (arts) assessment, in order of selection frequency in the context of RttT funding for “hard-to-assess” content areas

Items 6a and 6b examined arts assessment needs beyond professional development. Item 6a required respondents to indicate what, other than professional development, is needed for implementation of arts assessments to support achievement of their state's arts standards. Respondents were directed to choose all that applied: better or more funding; technology; policy; time; local buy-in; SEA support (within chain-of-command); and/or other (please specify). Respondents were then given an opportunity to describe "other," if selected, and/or elaborate on one or more responses. With the ability to choose "all that apply," as shown in Graph 4.16, slightly more than three in four respondents to Item 6a indicated a need for better or more funding (77.5%), followed by time (72.5%), local buy-in (67.5%), policy (65.0%), state education agency support (55.0%), and technology (35.0%).


Graph 4.16 Responses to Item 6a, indicating needs other than professional development for implementation of arts assessments to support achievement of state arts standards, in order of selection frequency

Item 6b was prompted by the recognition that resources are valuable only if stakeholders have appropriate access to them. The list of variables shown in Table 4.3 was generated from known resource types in arts assessment development and cross-referenced with important users of that information.

Table 4.3 Stakeholder Access to Arts Assessment Resources

Item 6b—In your state, who has reasonable access to the following? Choose all that apply on each line. Note: This question acknowledges that professional development often focuses on “assessment,” rather than “arts assessment.”

Stakeholder columns: DAE | Local admins. | Assessment team(s) | Higher ed. | Arts teachers | Public (e.g., parents, students) | NA/Unknown

Resource rows:

a. Arts assessment item banks

b. (Arts) assessment experts

c. (Arts) assessment presentations, conferences, institutes

d. Higher education resources (e.g., classes, experts, research)

e. Print resources

f. Released test samples

6b+ Please elaborate on one or more responses in Question 6b above

(p. 82) Access to (arts) assessment resources was analyzed separately for RttT (Graph 4.17) and non-RttT states (Graph 4.18), showing some notable differences. RttT states reported DAE access to test-item banks in the arts (83.3%) at roughly twice the rate of their non-RttT counterparts (41.2%). With the exception of access to (arts) assessment experts, which 50.0% of RttT DAEs reported having, RttT DAEs and arts teachers had significantly greater access than their non-RttT peers to (arts) assessment presentations, conferences, and institutes (66.7% each); higher education resources (e.g., classes, experts, research) (50.0% each); print resources (66.7% each); and released test samples (66.7% each).


Graph 4.17 Responses to Item 6b, indicating stakeholder access to arts assessment resources, as reported by RttT states


Graph 4.18 Responses to Item 6b, indicating stakeholder access to arts assessment resources, as reported by non-RttT states

Local administrators and assessment teams in RttT states were reported as having significantly greater access to test-item banks in the arts (50.0% each) and higher education resources (50.0% and 33.3%, respectively), and local administrators in these states were reported as having significantly greater access to released test samples (50.0%) than their non-RttT counterparts. In contrast, fewer than one in four (23.5%) arts teachers in non-RttT states were reported as having access to arts assessment item banks, whereas their RttT peers were reported at a rate of one in two (50.0%).

Question 5. How is technology impacting arts assessment?

Item 7- In your state, how is computer-based technology impacting arts assessment practices?

(p. 83) Based on previous anecdotal reports of digitally delivered assessments in music and other arts, and on assessment work relative to the National Core Arts Standards, a query was included regarding the role of technology in various facets of arts assessment. Strength of impact was to be rated on a scale of 0–4, with 4 indicating the greatest impact. The variables were: test design/development; delivery of assessment (e.g., paper-and-pencil, digital); cost savings; sharing resources; expansion of assessment options (e.g., additional delivery (p. 84) and response modes); delivery of professional development in assessment; student attitudes; evaluation/grading; data analysis; and other (please specify). As with previous items, respondents were given an opportunity to describe “other,” if selected, and/or elaborate on one or more responses. Data were sorted by access to RttT funding and analyzed in frequency groups of strong impact (4 or 3), little or no impact (2 or 1), and no impact (0).
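A minimal sketch of this grouping step follows, assuming each respondent’s 0–4 rating for each variable is stored alongside a flag for RttT status. The record layout and sample values are hypothetical illustrations of the analysis described above, not the study’s actual data.

```python
# Hypothetical sketch of the Item 7 analysis: 0-4 impact ratings are
# collapsed into three bands (strong = 4 or 3, little or no = 2 or 1,
# none = 0) and tallied separately for RttT and non-RttT respondents.
from collections import defaultdict

# Illustrative records: (is_rtt_state, variable, rating 0-4)
ratings = [
    (True, "test design/development", 4),
    (True, "delivery of assessment", 3),
    (False, "sharing resources", 2),
    (False, "data analysis", 0),
    # ... one record per respondent per variable
]

def band(rating):
    if rating >= 3:
        return "strong"
    if rating >= 1:
        return "little or no"
    return "no impact"

tallies = defaultdict(int)   # (group, variable, band) -> count
totals = defaultdict(int)    # (group, variable) -> respondent count

for is_rtt, variable, rating in ratings:
    group = "RttT" if is_rtt else "non-RttT"
    tallies[(group, variable, band(rating))] += 1
    totals[(group, variable)] += 1

# Report the share of each group rating each variable in each band.
for (group, variable, b), count in sorted(tallies.items()):
    share = count / totals[(group, variable)]
    print(f"{group} | {variable} | {b}: {share:.1%}")
```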

In RttT states (Graph 4.19), test design/development, delivery of assessment, and delivery of professional development, each identified by 66.7% of respondents, were the facets most strongly impacted by the use of technology. Cost savings, expansion of assessment options, and data analysis also rated highly, each reported as strongly impacted by 50.0% of respondents. Technology was most often reported as having little or no impact on sharing resources (16.7%) in RttT states. Of note is that 33.3% of RttT respondents reported the impact of technology on student attitudes as strong.


Graph 4.19 Responses to Item 7, indicating strength of technology impact on arts assessment activities, reported by RttT states as strong (4 or 3); little or no (2 or 1), or no impact (0)

Respondents from non-RttT states (Graph 4.20) reported similar findings, but to a lesser degree. Although the highest rate of strong impact was reported for sharing resources (38.3%), slightly more than one in three respondents reported technology as having a strong impact on expansion of assessment options, delivery of professional development, and data analysis (35.3% of all non-RttT respondents). Slightly more than one in four respondents (26.5%) in non-RttT states reported technology as having a strong impact on cost savings and evaluation/grading.


Graph 4.20 Responses to Item 7, indicating strength of technology impact on arts assessment activities, reported by non-RttT states as strong (4 or 3); little or no (2 or 1), or no impact (0)

Respondents described broad differences in the role of technology, from advanced statewide use, excluding the arts, to districts embracing its use, despite there being no technology initiative in the state. One respondent cited the critical role technology has played in “all arts learning, for students and teachers, across the state, especially in our most isolated communities.”

Question 6. How are arts assessment results used?

(p. 85) Item 8 required respondents to indicate how arts assessment results are used at local and state levels. For each variable, they could select Local, State, N/A, and/or Unknown. The variables were: instructional planning in the classroom; school evaluation/school grade; teacher accountability; program accountability; funding distribution/penalties; arts program retention/sustainability; assessment restructuring; conference/professional development design; and other (please specify).

As shown in Graph 4.21, respondents most often reported the use of arts assessment data at the local level for instructional planning in the classroom (67.5%); only 2.5% reported using such data at the state level for instructional planning in the classroom. In fact, very little use of arts assessment data is made at the state level: the uses most often reported were for presentation at local conferences/professional development design (25.0%), teacher accountability (22.5%), and school evaluation/school grade (20.0%).


Graph 4.21 Responses to Item 8, indicating the degree to which arts assessment results are used at local and state levels, sorted in order of selection frequency in the context of local usage

Among additional descriptions provided by respondents were several that pointed again to this being a matter of local control. They described data as being used only for parent-teacher conferences or for teacher evaluation, but one respondent noted that several districts in his/her state “that develop and implement such fine arts assessment programs may use the results in many ways as indicated in Item 8.”

At the close of the survey, respondents were given the option of providing links to current or recently developed state resources, initiatives, projects, programs, and so forth in arts assessment. They were also invited to include brief descriptions with each link, as shown in Table 4.4. (p. 86)

Table 4.4 Open-Ended Responses for Additional Resources, Initiatives, Programs, and Projects

Optional response items:

1. http://ok.gov/sde/arts#Assess

2. http://www.education.nh.gov/instruction/curriculum/arts/index.htm

3. www.pdesas.org: This is our state curriculum website. The Homeroom icon takes the participant to assessment models within the SLO process. The Teacher Tools section has a professional learning community called Assessment Literacy. The Publish Your Best community has teacher-developed lesson plans with embedded assessments.

4. Maryland will begin statewide arts assessment development workshops in February 2016.

5. Maine Arts Leadership Initiative: https://www.maineartsassessment.com/; Maine Department of Education Arts Resources: https://www.maine.gov/doe/arts/resources/index.html; Maine Arts Commission: https://mainearts.maine.gov/Pages/Programs/MAAI

6. Multiple Pathways Initiative in progress: http://www.p12.nysed.gov/ciai/multiple-pathways/; High School (Grade 9) Arts Assessment Sampler Draft and Statewide Assessment of the High School Graduation Requirement for the Arts at http://www.p12.nysed.gov/ciai/arts/pubart.html

7. New Jersey is a second-tier Race to the Top state. As such, it was required to create a model curriculum framework that could be used by all districts, but was required for the priority districts and strongly encouraged for the state focus schools (those scoring in the bottom 5% and 10% of schools, respectively). The frameworks can be found on the Department’s website at http://www.state.nj.us/education/modelcurriculum/vpa/. Assessment exemplars have also been developed that align to the Student Learning Outcomes identified in the framework, derived from cumulative progress indicators embedded in the visual and performing arts. Publication of the assessments is pending; they will be available soon.

(p. 87) 8. http://education.ohio.gov/Topics/Learning-in-Ohio/Fine-Arts

9. The California Arts Project: https://www.lacountyartsedcollective.org/sites/lacaec/files/media/pdf/materials/complete_handbook.pdf?md=16 (an overview of the offerings of this project); CCSESA Arts Initiative, Arts Assessment Resource Guide: http://ccsesaarts.org/arts-learning/assessment/; California Department of Education, Visual and Performing Arts Framework, Chapter 5, Arts Assessment: http://www.cde.ca.gov/ci/cr/cf/documents/vpaframewrk.pdf#search=VAPA%20Framework&view=FitH&pagemode=none

10. http://www.azed.gov/art-education/assessment/

11. Delaware Performance Appraisal System (all research, rationale, and protocol for implementation of teacher evaluation): http://www.doe.k12.de.us/domain/186

12. http://www.coloradoplc.org/assessment

13. New Mexico has developed End of Course exams (EOCs) for Art and Music, and has started looking at Theatre and Dance information. Our Assessment Bureau is developing the EOCs with art and music teachers, higher education contacts, and arts coordinators. The DAE recommended some of the participants, attended some of the meetings, and reviewed the final documents. The DAE looks for the assessments to become more authentic and sophisticated as the process moves forward. Web links: https://webnew.ped.state.nm.us/bureaus/assessment/end-of-course-exams/, then open A to Z, go to A, open assessment/accountability/evaluation, then go to assessment; then to EOCs; then EOC information; and then to EOC blueprints.

14. Program Review: http://education.ky.gov/curriculum/pgmrev/Pages/default.aspx

15. A few years back, there were common arts assessments developed and used by some districts. Some of those assessments influenced the NCAS model cornerstone assessments.

16. http://ncasw.ncdpi.wikispaces.net/

17. www.michiganassessmentconsortium.org/maeia

18. Virginia Department of Education Fine Arts Web page: http://www.doe.virginia.gov/instruction/fine_arts/index.shtml

Discussion

Five major findings emerge from the 2015–2016 study:

  • Educator accountability is currently the most consistent driver of arts assessment in the nation.

  • In the absence of state policy and funding, “local control” plays an important role in the provision of arts-specific professional development for educators and in the creation of valid and reliable arts assessments of student learning.

  • Professional development in arts assessment is most often provided online and/or at cost to the consumer.

  • The efficacy of technology in arts assessment is little-known and its power underused.

  • Policy, funding, and time remain the greatest needs for consistently high-quality, research-based assessment of student learning in the arts.

(p. 88) Evaluating educator effectiveness emerges as a key driver of arts assessment across the states, but most particularly in states that were awarded federal RttT funds for the arts as one of the hard-to-measure content areas. By extension, knowledge of the nuts and bolts of arts assessment becomes more critical in this era of accountability.

In our state-centered educational system, which relies heavily on “local control” decisions, the responsibility for arts assessment falls most often to the arts educator in the classroom and, if available, a district administrator/arts supervisor. To support these grassroots efforts, DAEs provide guidance, professional development, access to tools and models, and expertise for statewide and organizational initiatives, as available and allowed within SEA parameters. Outstanding work in arts assessment exists in pockets: schools, school districts, consortia, statewide arts education organizations, states, and national organizations have toiled diligently to provide fair, valid, and reliable assessments, both for students within their purview and as models for others to use in their own work.

Professional development has been identified repeatedly as a requirement for valid and reliable practices in arts assessment; training DAEs and arts teachers alike in assessment literacy and arts-specific assessment is critical. Too often, gaps in assessment literacy result in disjointed arts assessment systems. Consequently, the more assessment literacy one has, the more likely it is that coherent systems will be put in place.

Tracking opportunities for arts-specific professional development and access to resources in arts assessment for teachers and administrators at the local level is challenging; there is currently no vetting system to ensure the quality of professional development offerings. Educators are often left to their own devices to find opportunities online or at conferences, if able to attend, and they often bear the cost for such professional growth opportunities themselves. SEADAE was cited as a significant provider of professional development in arts assessment for DAEs, suggesting that the organization may be positioned to provide portions of that expertise outward in a manner that grassroots educators can readily access from anywhere in the nation.

Anecdotal data also indicates that professional development in assessment provided by schools, districts, and SEAs is often generalized to all content areas or to those addressed by Common Core State Standards and high-stakes testing. In those cases, DAEs and arts teachers must rely on transposing to their own arts discipline(s) what was presented.

Technology has a significant role to play in advancing arts assessment. Stakeholders are largely unclear about how the power of technology can be harnessed to make assessment a more informative instructional tool, yet it can be the breakthrough catalyst for bringing to scale a movement toward quality arts assessment practices. Much can be learned from extant models and tools, and DAEs readily share available best-practice models with peers and other partners, as funding parameters and copyright allow. Of course, these models are most easily shared among arts education stakeholders who work from closely aligned learning standards, but transferability in technology is likely to be high, nonetheless.

(p. 89) Several respondents articulated excellent support within their SEA chain-of-command, regretfully noting, however, that priorities and budget constraints often overcome best intentions. It is not surprising, in this time of high-stakes testing, that many educators across all content areas have come to believe that “what is valued is tested, and what is tested is taught.” Like the legs of a three-legged stool, policy, funding, and time are all vital to a fully supported program of valid and reliable arts assessment. These imperatives translate into priorities, and priorities into perceived value.

Proof is found in Michigan’s Model Arts Education Instruction and Assessment project (MAEIA), the arts arm of the Michigan Assessment Consortium (MAC) (Michigan Assessment Consortium, 2016a, 2016b), a professional association of educators who believe quality education depends on accurate, balanced, and meaningful assessment. The MAC members work to advance assessment literacy and advocate for assessment education and excellence. This project, in which SEADAE is a partner, exemplifies what happens when policy, time, and funding conjoin with technology to advance arts assessment practices.

As a result of the data analysis and review of open-ended responses, several provocative questions emerged:

  • To what degree does a lack of arts assessment policy in states impede engagement with local and state stakeholders in arts assessment activities? Are there deeper issues at hand?

  • Is arts education sufficiently valued in our culture to be considered an educational priority equal to other content areas?

  • Will the ESSA (S.1177, 114th Congress, 2015–2016), with its emphasis on providing a well-rounded education, become a pivot point for states to balance content areas more equally, bringing the modest influx of policy, funding, and time so necessary to arts education and assessment of student learning in the arts?

Despite these questions, this much is clear: arts-supportive policy generates positive outcomes. Here is a prime example from a response to the final survey question:

New Jersey is the first state in the nation to include the arts as part of its annual school report card. The result has been that some schools that historically ranked highest among the state’s schools have been bumped by new schools that have high academic achievement levels within their school population and robust arts programs. For the first time that I can recall, principals have gone on record touting the integrity of the arts program offerings as an indicator of school quality. Moreover, within weeks of the Department’s mandate of the inclusion of the arts in the school State Report Cards, the magazine that rates New Jersey’s schools changed its ranking criteria to include access to opportunities for learning in the arts. The reason cited for the change was that having a robust arts curriculum was a powerful indication of its interest in the return to a focus on “the whole child.”

(p. 90) What about policy intentions, and do such intentions matter? For example, states that funded arts assessment work through RttT grants were driven by the requirement for teacher evaluation. These states developed a variety of assessment tools for this singular purpose. Arguments can be made as to whether student achievement alone is a fair measure of teacher effectiveness; regardless, this was the driver that moved arts assessment practices forward in RttT states.

The federal policy shift from NCLB to ESSA provides a landmark opportunity for the arts to be included in local and state accountability systems in the United States. Possibilities abound with this new legislation: because of ESSA’s emphasis on provision of a well-rounded public education to all students and its flexibility for local districts and states to redesign aspects of their accountability systems, the stage is set for significant modifications in how states and schools demonstrate high-quality education. States and schools may now be able to shift the focus from using student assessment results as a measure of educator evaluation to the more relevant purpose of using assessment results to improve teaching and learning.

The comprehensive goals expressed in A Facets Model for State Arts Assessment Leaders (Brophy, 2014) (see Figure 4.2), presented at the SEADAE Arts Assessment Institute (July 2014), identify focal points to guide DAEs in the work ahead. The facets model relies on professional development for and by state arts leaders and teachers.


Figure 4.2 A facets model for state arts assessment leaders (Brophy, 2014).

The facets of arts assessment leadership are a subset of knowledge and skills that pertain to successful leadership in arts education. Many facets are new applications to, or uses of, regularly employed skills; others are extensions of current tasks. These facets are interdependent, rather than discrete, working in tandem with other skills to advance arts assessment in any given state. Of course, assessment literacy is foundational to this work, as well. State-level leadership depends on using all facets in a combination optimized for each individual.

Under ESSA Section 8002: Definitions (Professional Development) (S.1177, p. 290), federal funds are available to support professional development for educators. Title IIA includes an emphasis on content-knowledge development. Instead of arts teachers spending professional development time listening to detailed data analyses of student performance in reading and mathematics, they could be engaged in such content-specific professional development as increasing their ability to assess and analyze students’ achievement in the arts.

There is significant momentum in arts assessment across most of the nation. SEADAE has learned a great deal from states’ earlier work (e.g., Colorado, Connecticut, Florida, Kentucky, Michigan, South Carolina, Texas, and Washington) and has model projects in MAEIA, the MCAs, and state initiatives that draw on the flexibility of ESSA to create assessments in the arts.

The will to engage in arts assessment activity is strong. Technology exists to assist in delivering professional development; to provide accessible and economical means of administering and analyzing assessments; and to capture, store, and share student work. Policy changes at the federal level may well work in favor of arts education. (p. 91) Through leadership, strong organizational relationships, and a shared vision for change, the United States is on the cusp of an arts assessment evolution.

Suggestions for Further Study

In addition to the questions posed in the Discussion, several areas for further study remain. Some were set aside as being too cumbersome to include in this study; others arose during the administration of the survey and subsequent data analysis. They are:

  • What knowledge can be gleaned from an analysis of arts assessment policy and practice in states having strong policy in place?

  • To what degree do arts education stakeholders have access to professional development that addresses “arts-specific assessment” rather than “assessment”? Does it make a difference?

  • (p. 92) How can SEADAE facilitate provision of widely available professional development to arts education stakeholders, including those at the grassroots level, and in what areas should professional development be provided by SEADAE?

  • What are the benefits and challenges of cross-pollinating assessment practices across arts disciplines and across state lines?

As arts educators across the nation become involved in the unfolding of state and local practice under the ESSA, change is inevitable. What is certain is that DAEs are prepared and eager for arts education stakeholders at all levels to commit to and become engaged in well-designed arts assessment for today’s authentic approach to measuring student outcomes.

References

Brophy, T. S. (2014). A facets model for state arts assessment leaders. Gainesville: University of Florida. Retrieved July 28–30, 2014.

College Board. (2011a, August). Arts education standards and 21st century skills: An analysis of the national standards for arts education (1994) as compared to the 21st century skills map for the arts. Retrieved from https://www.nationalartsstandards.org/sites/default/files/College%20Board%20Research-%20%20P21%20Report.pdf

College Board. (2011b, August). International arts education standards: A survey of the arts education standards and practices of thirteen countries and regions. Retrieved from https://www.nationalartsstandards.org/sites/default/files/College%20Board%20Research%20-%20International%20Standards_0.pdf

College Board. (2011c, November). A review of selected state arts standards. Retrieved from https://www.nationalartsstandards.org/sites/default/files/NCCAS%20State%20and%20Media%20Arts%20report.pdf

College Board. (2012a, January). Child development and arts education: A review of current research and practices. Retrieved from https://www.nationalartsstandards.org/sites/default/files/College%20Board%20Research%20-%20Child%20Development%20Report.pdf

College Board. (2012b, August). College-level expectations in the arts. Retrieved from https://www.nationalartsstandards.org/sites/default/files/College%20Board%20Research%20-%20College%20Expectations%20Report.pdf

College Board. (2012c, December). The arts and the common core: A review of the connections between the common core state standards and the national core arts standards conceptual framework. Retrieved from https://www.nationalartsstandards.org/sites/default/files/College%20Board%20Research%20-%20Arts%20and%20Common%20Core%20-%20final%20report1.pdf

College Board. (2014, July). The arts and the common core: A comparison of the national core arts standards and the common core state standards. Retrieved from https://www.nationalartsstandards.org/sites/default/files/College%20Board%20Research%20-%20Arts%20and%20Common%20Core%20-%20final%20report1.pdf

Consortium of National Arts Education Associations. (1994). National standards for arts education: What every young American should know and be able to do in arts education. Reston, VA: Author.

(p. 93) Great Schools Partnership. (2016). The glossary of education reform for journalists, parents, and community. Retrieved from https://www.greatschoolspartnership.org/

Lovins, L. T. (2010). Assessment in the arts: An overview of states’ practices and status. In T. S. Brophy (Ed.), The practice of assessment in music education: Frameworks, models, and designs (pp. 23–42). Chicago, IL: GIA Publications.

Michigan Assessment Consortium. (2016a). About MAC. Retrieved from http://www.michiganassessmentconsortium.org/about-mac

Michigan Assessment Consortium. (2016b). MAEIA: Michigan’s Model Arts Education Instruction and Assessment project. Retrieved from http://www.michiganassessmentconsortium.org/maeia

National Archives. (n.d.). United States Constitution, Article the 12th, Amendment X. Retrieved from https://www.archives.gov/founding-docs/bill-of-rights

National Assessment Governing Board. (2008). 2008 arts education assessment framework. Retrieved from https://www.nagb.gov/content/nagb/assets/documents/publications/frameworks/arts/2008-arts-framework.pdf

National Association for Music Education. (2016). ESSA implementation and music education: Opportunities abound. Retrieved from https://nafme.org/wp-content/files/2015/11/ESEA-Implementation-and-Music-Education-Opportunities-Abound-FINAL.pdf

National Coalition for Core Arts Standards. (2014). National core arts standards. Retrieved from https://nationalartsstandards.org/

National Coalition for Core Arts Standards. (2016). History. Retrieved from http://nccaswikispaces.com/history

S.1177. Every Student Succeeds Act, Pub. L. No. 114-95 (2015). (114th Congress). Retrieved from https://www.gpo.gov/fdsys/pkg/BILLS-114s1177enr/pdf/BILLS-114s1177enr.pdf

State Education Agency Directors of Arts Education (SEADAE). (2012). About SEADAE. Retrieved from http://seadae.org/about.aspx

State Education Agency Directors of Arts Education (SEADAE). (2016). Status of arts assessment in the USA. Retrieved from https://www.nationalartsstandards.org/sites/default/files/NCCAS-State-Reports-Since-2014.pdf

US Department of Education. (2009, December). Fact sheet: The Race to the Top. Retrieved from https://www2.ed.gov/programs/racetothetop/factsheet.pdf (p. 94)