Copyright © 2003 by The Riverside Publishing Company. All rights reserved. Permission is hereby granted to photocopy the pages of this booklet. These copies may not be sold and further distribution is expressly prohibited. Except as authorized above, prior written permission must be obtained from The Riverside Publishing Company to reproduce or transmit this work or portions thereof, in any other form or by any other electronic or mechanical means, including any information storage or retrieval system, unless such copying is expressly permitted by federal copyright law. Address inquiries to Permissions Department, The Riverside Publishing Company, 425 Spring Lake Drive, Itasca, IL 60143-2079.

Printed in the United States of America.

Reference Citation

To cite this document, use:
Braden, J. P., and Elliott, S. N. (2003). Accommodations on the Stanford-Binet Intelligence Scales, Fifth Edition. (Stanford-Binet Intelligence Scales, Fifth Edition Assessment Service Bulletin No. 2). Itasca, IL: Riverside Publishing.

TerraNova CTBS is a trademark of The McGraw-Hill Companies, Inc.

For technical information, please call 1.800.323.9540, visit our website at www.stanford-binet.com, or e-mail us at rpcwebmaster@hmco.com.


Accommodations on the Stanford-Binet Intelligence Scales, Fifth Edition

Purposes of Accommodation

There are three reasons why examiners using the SB5 must consider accommodating clients who have, or who are suspected of having, disabilities: (a) laws, litigation, and regulations requiring accommodations; (b) ethical codes of conduct; and (c) enhanced validity of assessment. A discussion of each of these reasons follows.

Laws

Federal civil rights laws provide individuals with disabilities the legal rights to accommodations. Title VI of the Civil Rights Act (1964), Title IX of the Education Amendments (1972), and Title II of the Americans with Disabilities Act (1990) mandate access for individuals with disabilities in a variety of assessment processes and settings. Furthermore, federal legislation requires entities receiving federal funds to provide access and accommodations to individuals with disabilities (e.g., Section 504 of the Rehabilitation Act of 1973), and the Individuals with Disabilities Education Act (IDEA) (1997) provides protections to individuals with disabilities in the assessment process. Together, these laws provide a clear mandate to require accommodations for individuals with disabilities in assessment processes, whether those processes occur in public or private settings (Office of Civil Rights, 2000). Furthermore, the right to accommodations extends to all individuals with disabilities (i.e., children and adults). Therefore, examiners must consider testing accommodations when assessing individuals who have or who are suspected of having a disability.

Ethics

Most professional organizations promulgate ethical standards to guide appropriate conduct in professional practice. Psychological ethics standards (e.g., American Psychological Association, 2002; National Association of School Psychologists, 2000) provide some general principles to guide practice (e.g., responding to clients' unique needs, including sensitivity to disabilities). The ethical standards clearly state that examiners must consider accommodations for individuals with disabilities and must use accommodations appropriately when assessing an examinee with a disability. Unfortunately, these standards are not specific in addressing issues related to testing accommodations.

Fortunately, these same ethical standards usually direct professionals to consult the Standards for Educational and Psychological Testing (Standards) (American Educational Research Association, American Psychological Association,




than they would reflect the examinee's cognitive abilities. Therefore, those scores would have significant construct-irrelevant variance that would invalidate their meaning (i.e., they would not represent cognitive processes because they are influenced by visual acuity). Therefore, examiners must make testing accommodations when administering the SB5 to reduce or eliminate construct-irrelevant variance as an influence on test scores.

Legal Issues in Assessment

Writing from a legal perspective, Phillips (1993, 1994) argued that examinees with disabilities must have appropriate access to assessments. To ensure access, examiners must provide examinees with appropriate accommodations. Although early litigation and legal responses focused on issues such as physical access to testing accommodations (e.g., college test companies must provide wheelchair-accessible furniture when administering tests), the discussion has expanded to balance the rights of test takers against the obligations of test providers to ensure accommodations are fair and equitable. To guide these decisions, Phillips draws a distinction between two types of skills that assessments require: access skills and target skills.

Access Skills

Assessment conditions require or assume that examinees have certain skills. For example, reading achievement tests require examinees to be able to see test items (visual acuity skills), sit upright throughout the test session (gross motor skills), indicate correct answers (fine motor skills), and work on test items throughout the testing interval (attention, stamina skills). These skills are not the focus of the assessment; rather, they are skills required to access the test items and content. Therefore, these skills are considered access skills. Accommodations to address access skills level the playing field for examinees with disabilities because they remove the barriers these examinees have and allow examinees to show their true skills. Furthermore, examinees with disabilities have the right, and test administrators have the obligation, to adapt the testing situation to minimize the effects of these access skills on test outcomes. Therefore, testing accommodations should address access skills so that examinees are afforded appropriate access to assessments.

Target Skills

In contrast to access skills, target skills are the skills that assessments are intended to measure or reflect. Examinees with disabilities do not have the right to accommodations that address target skills because such accommodations would be unfair (i.e., they would incorrectly inflate the examinee's scores relative to those who did not have accommodations). To return to the example of a reading achievement test, some of the target skills in the test would be the ability to decode text (word reading), understand text (comprehension), and apply previous knowledge to information in the text (evaluate and extend meaning). Providing a large print version of the test to an examinee with a visual impairment would be appropriate (i.e., the accommodation addresses the access skill of visual acuity), but reading the test aloud to the examinee would not be appropriate (i.e., it changes the target skill of decoding text by substituting a different skill, listening comprehension). Accommodations that alter target skills are inappropriate and are likely to


violate the rights of other test takers by providing an unfair advantage to examinees with disabilities. However, accommodations that address access skills are not only allowed but are required because they protect the examinee's right to gain the same access to the test afforded to examinees without disabilities.

There are strong parallels between validity and legal perspectives on testing accommodations. Both approaches direct examiners to ensure that assessment accommodations retain adequate construct representation (i.e., do not affect target skills) and concurrently reduce construct-irrelevant variance (i.e., reduce or eliminate the influence of access skills).

Accommodations Versus Modifications

Contemporary research (e.g., Thurlow, Elliott, & Ysseldyke, 1998) and the Standards (AERA, APA, NCME, 1999) concur in drawing a distinction between test accommodations and test modifications. Test accommodations are changes to the testing situation that retain construct representation (i.e., do not affect target skills) while reducing construct-irrelevant variance (i.e., address access skills). In contrast, test modifications are changes to testing that affect the construct or target skills of the test (e.g., providing easier items, substituting tests of other skills). Therefore, test accommodations are legally mandated and appropriate to ensure validity, whereas test modifications are neither mandated nor desirable because they are likely to alter validity.

Conclusions

Historically, cognitive assessment research and practice addressing the needs of disabled examinees have approached the issue of accommodations in an intuitive manner (Braden, in press). Clinicians, and the ethical standards and practices they have developed, have been intrinsically more concerned with reducing construct-irrelevant variance than with ensuring adequate construct representation. That is, fairness in response to client concerns has received greater attention than ensuring adequate breadth in assessments. However, advances in cognitive theory have increased the degree to which clinicians can identify, and thus adequately include, construct representation in cognitive assessments (Braden).

Contemporary legal and ethical developments also provide a mandate for testing accommodations. Moreover, scientific and legal perspectives provide principles that examiners can use to guide decisions about whether, when, and how to accommodate examinees with disabilities in assessments. The next section of this bulletin reviews the literature on assessment accommodations to provide a background for the recommendations that follow.

Assessment Accommodations Research

The key question driving assessment accommodations research is whether the scores with and without accommodations are comparable (Phillips, 1994). In effect, do scores from nonstandard test administrations mean the same thing as scores from standard test administrations? Test standardization is the traditional method for making test results among examinees comparable (McDonnell, McLaughlin, & Morison, 1997). Standardization, however, may reduce the


comparability of scores for students with disabilities because the disability itself biases the score by creating construct-irrelevant variance in the score (McDonnell et al., p. 173). Therefore, the use of accommodations reduces that variation in score caused by a disability but precludes standardization because accommodations, by their very nature, change the standard administration of the test. The essential questions are: At what point does the change from standard test administration intended to improve score comparability actually change the task and harm score comparability? And how does an examiner know whether the tasks and resulting scores are no longer comparable?

Interestingly, most of the research addressing these questions has been undertaken in assessment contexts not involving intelligence tests. That is, the majority of research focuses on the inclusion of test takers with disabilities in nonclinical assessments, such as large-scale achievement testing, college entrance exams, or employment tests. Research in contexts other than intelligence testing is reviewed first, and then a review of available research using previous editions of the Stanford-Binet and other clinical tests of intelligence is presented.

Decisions About Score Comparability: The Role of Different Research Designs

Tindal (1998) proposed three models for making decisions about task comparability. These models are descriptive (model 1), comparative (model 2), and experimental (model 3) in nature. The focus of all three models is to determine whether the construct that is being measured changes as a result of testing accommodations. The models represent a continuum of evidence for task comparability, from the weakest evidence (descriptive) to the strongest (experimental).

In Tindal's (1998) descriptive model, evidence about task comparability relies on current policy for decision-making. Many testing accommodation policies provide descriptive evidence because they do not offer explanations or justifications of why judgments about accommodations in policy are made. Instead, policies can be created based on external information (e.g., the policies of other states) or may offer simple procedural recommendations for selecting and using accommodations without stating a rationale for doing so. As part of their policies, school districts may track the implementation of various accommodations to understand how frequently accommodations are selected and used for certain tasks. This tracking may provide further descriptive evidence about the relationship of tasks to one another when accommodations are provided.

The comparative model moves beyond descriptive information by relying on multiple sources of data to judge comparability (Tindal, 1998). These data provide retrospective information about the use of accommodations, judgments about their appropriateness, and performance outcomes to make relational statements about accommodations and performance. However, because of its post-hoc nature, cause and effect relationships between accommodations and outcomes cannot be inferred. Threats to internal validity, such as selection bias, participant maturation, or lack of comparison groups, exist that compromise the utility of this approach for making meaningful and appropriate decisions about task comparability.

Some of the aforementioned threats to validity can be avoided by establishing a research design prior to the collection of data; this is the hallmark of the experimental model for determining task comparability (Tindal, 1998).


Experimental models include research designs and technically sound measurements to enhance the likelihood that meaningful inferences about outcomes can be made. Both large-group and single-case designs fall under the realm of the experimental model. To be informative, these designs must provide inferences about cause and effect on the general population with statistical conclusions. In large-group designs, groups of students are compared with themselves or with other groups under varying test-taking conditions. In single-case designs, the performance of the same student under varying test-taking conditions is compared. The results of experimental studies are likely to provide the best evidence for task and score comparability because these studies account for the limitations encountered by descriptive or comparative methods.

Of course, most educators or assessment professionals are unlikely to undertake an experimental approach to determine if a testing accommodation is effective and valid. They most often will be confronted with the need for and use of testing accommodations during a referral or individualized education plan (IEP) meeting and may need assistance in selecting, planning, and implementing testing accommodations. Research suggests that most educators and assessment professionals will benefit by using a structured process to make these decisions (Elliott, Kratochwill, & Schulte, 1999). Phillips (1994) suggested the following five questions that assessment professionals might consider when looking at departures from standard testing conditions:

❑ Will the skills measured be altered by changes to the format or testing conditions?
❑ Will scores mean the same thing for examinees depending on whether they obtained them under standard or accommodated testing conditions?
❑ Would receiving the same accommodation benefit nondisabled examinees?
❑ Might the examinee with the disability be able to adapt to the conditions of standard test administration?
❑ Do procedures of limited reliability and validity affect either the evidence of the examinee's disability or the policy for testing accommodations?

Phillips argued that an accommodation might be inappropriate if the answer to any of these questions is "yes." However, the assessment specialist will always be in a somewhat difficult position, balancing the need to maintain a valid (and as much as possible, standard) assessment activity versus the rights of examinees with disabilities. As Phillips put it, "the goals of providing maximum participation in society for the disabled and maintaining the validity of the testing program may be at odds" (p. 104).

Testing Accommodations: What Does the Research Literature Tell Us?

Elliott, Kratochwill, and McKevitt (2001) conducted a study designed to (a) describe the nature of information on testing accommodations listed on students' IEPs, (b) document the testing accommodations educators actually use when assessing students via performance assessment tasks, and (c) examine the effect accommodations have on the test results of students with and without disabilities. Participants in the study included 218 fourth-grade students from


disabilities. The authors predicted that accommodations would significantly improve the test scores of students with disabilities but would not significantly improve the test scores of students without disabilities. Participants in the study were 86 fourth-grade students, including 43 students with disabilities (entitled students with mild disabilities) and 43 students without disabilities. The students' performances were measured on two equivalent versions of the mathematics test of the TerraNova CTBS Multiple Assessments (CTB/McGraw-Hill, 1994–2000), designed to align with the Standards of the National Council of Teachers of Mathematics (1989).

Teachers of participants who had disabilities reviewed their IEPs to determine which accommodations the research team would use. Each student who did not have a disability was paired with a student who did have a disability, and the research team administered the TerraNova CTBS Multiple Assessments (CTB/McGraw-Hill, 1994–2000) to the students in pairs. Both students in each pair received the accommodations outlined on the IEP of the student who had the disability. All students participated in a practice session to become familiar with the testing procedures and accommodations, and all students took one version of the test with accommodations and one version of the test without accommodations. The researchers randomly assigned the order of accommodated and nonaccommodated conditions as well as the pairs of students. The key independent variables in the study were testing condition (accommodated versus nonaccommodated) and disability status (with disability versus without disability). The dependent variables in the study were the scores from the TerraNova CTBS Multiple Assessments.

The results showed that both groups improved significantly when the accommodated condition was compared to the nonaccommodated condition. However, students with disabilities benefited more from accommodations on multiple-choice questions, and both groups benefited equally on constructed-response questions. For multiple-choice questions considered alone, students with disabilities yielded an effect size of .41 between accommodated and nonaccommodated conditions, while students without disabilities yielded an effect size of 0. On constructed-response questions alone, those effect sizes were .31 and .35, respectively. On an individual level, there was essentially no difference between the effects of accommodations on students with disabilities and the effects of accommodations on students without disabilities. Twenty-seven out of 43 students with disabilities, and 29 out of 43 students without disabilities, achieved higher scores on the test when accommodations were available. Seventeen out of 43 students with disabilities, and 16 out of 43 students without disabilities, achieved higher proficiency levels on the test when accommodations were available. Twenty out of 43 students with disabilities, and 21 out of 43 students without disabilities, experienced no change in proficiency levels on the test when accommodations were available.

The finding that both groups of students experienced benefits from testing accommodations indicates that the changes in test procedure may be affecting both construct-relevant and construct-irrelevant variance. The differential interaction between accommodation group and question type could indicate that constructed-response questions are more difficult for all students, and that accommodations remove barriers to these questions that are not present in multiple-choice questions. These findings reinforce the notion that research on testing accommodations must take an individual perspective, and that all


students in such research should take the tests in both accommodated and nonaccommodated conditions, to determine whether accommodations truly help performance.

In a study conducted by Elliott and Marquart (in press), the use of an "extended time" accommodation on a mathematics achievement test was examined. Elliott and Marquart predicted (a) that students with disabilities, but not students without disabilities, would score significantly higher in the extended time condition than in the standard time condition, (b) that students with low math skills, but not students with higher math skills, would score significantly higher in the extended time condition, and (c) that all student groups would perceive the extended time condition as helpful in reducing anxiety by allowing them to exhibit what they know and increasing their motivation to finish tests. Participants in the study included 69 eighth-grade students, 14 of their parents, and 7 of their teachers. Among the students, 23 were classified as having disabilities, 23 were classified as educationally at-risk in the area of mathematics, and 23 were classified as students performing at grade level. Teachers completed the Academic Competence Evaluation Scales (ACES) (DiPerna & Elliott, 2000), a rating scale to classify students without disabilities as at-risk or performing at grade level. Student participants completed the mathematics test of the TerraNova CTBS Multiple Assessments (CTB/McGraw-Hill, 1994–2000), as well as a survey about the effects of the extended time accommodation. Each testing session included students from each of the three groups. Elliott and Marquart randomly assigned the order of conditions (accommodated and nonaccommodated) in which each student performed the test. When performing in the accommodated condition, students had up to 40 minutes to complete the test. When performing in the nonaccommodated condition, students had 20 minutes to complete the test. Parents and teachers of students in the study also completed the survey about the effects of the extended time accommodation.

Elliott and Marquart (in press) found that the effect of the extended time accommodation was not significant for students without disabilities, who yielded an effect size of .34. The accommodation was not significant for students with disabilities either, as their effect size was .26. The three groups (students with disabilities, at-risk, and grade level) did not differ significantly in their amount of change between accommodated and nonaccommodated conditions. When students without disabilities were considered as at-risk and grade level groups, the students in the at-risk group experienced an effect size of .48 between accommodation conditions, and students in the grade level group experienced an effect size of .20. However, according to the survey, most students perceived the following advantages in the extended time condition: they felt more comfortable, were more motivated, felt less frustrated, thought they performed better, reported the test seemed easier, and overall preferred taking the test under the extended time condition. Although most teachers held a similarly positive view of the extended time condition (88% indicated that a score from an accommodated test would be as valid as one for the same test without accommodations), few parents (21%) shared that view. Many parents (43%), but no teachers, believed that the score from an accommodated test would be less valid, and some members from both groups (36% of parents and 12% of teachers) were uncertain. Most members of each group (63% of teachers and 56% of parents) believed that if accommodations are used on a test, those accommodations should be reported with the test results.
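The accommodation effects in these studies are reported as effect sizes, that is, standardized mean differences between accommodated and nonaccommodated conditions. The short sketch below shows one common way such an index (Cohen's d with a pooled standard deviation) can be computed. The scores are hypothetical, and the bulletin does not specify which effect-size formula the cited authors used, so treat this only as an illustration of the general idea.

```python
# Minimal sketch: an accommodation "boost" expressed as a standardized mean
# difference (Cohen's d). The scores are hypothetical, and the pooled-SD formula
# is an assumption; the studies summarized above do not report their exact
# effect-size computation.
from statistics import mean, stdev

def cohens_d(accommodated, nonaccommodated):
    """Standardized mean difference between two lists of scores."""
    n1, n2 = len(accommodated), len(nonaccommodated)
    s1, s2 = stdev(accommodated), stdev(nonaccommodated)
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(accommodated) - mean(nonaccommodated)) / pooled_sd

# Hypothetical scale scores for one group of examinees under both conditions.
scores_accommodated = [612, 598, 640, 625, 587, 630, 605, 618]
scores_standard     = [601, 590, 628, 620, 580, 615, 600, 607]

print(f"Accommodation effect size: {cohens_d(scores_accommodated, scores_standard):.2f}")
```

By this convention, values near .2 are usually read as small effects and values near .5 as moderate ones, which is roughly the range of values reported in the studies above.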


McKevitt and Elliott (in press) studied the effects of testing accommodations on standardized reading test scores and the consequences of using accommodations on score validity and teacher and student attitudes about testing. While read-aloud accommodations are considered invalid by the testing policies in many states, to date there have been no published studies that actually analyzed their effects on reading test performance. To test the effects of the read-aloud accommodation, the reading performance of 79 eighth-grade students was tested on the TerraNova CTBS Multiple Assessments Reading Battery–Research Version (Form A) (CTB/McGraw-Hill, 1999). Forty of those students were diagnosed with an educationally defined disability and received special education services in the area of reading and/or language arts. The other 39 students were general education students used for comparison purposes. Four special education teachers and one general education teacher participated by recommending testing accommodations for these students using the Assessment Accommodations Checklist (Elliott, Kratochwill, & Schulte, 1999). They also rated students' reading achievement levels using the Academic Competence Evaluation Scales (DiPerna & Elliott, 2000). An additional 43 teachers and all tested students completed surveys about their perceptions of and attitudes about testing accommodations and standardized testing.

Once students were identified, they were divided into two groups (students with disabilities and students without disabilities). Within those groups, students were then divided into two test conditions (students receiving teacher-recommended accommodations and students receiving teacher-recommended accommodations plus a read-aloud accommodation). Students in each group and each condition completed two alternate parts of the reading test: one with accommodations (either teacher-recommended accommodations or teacher-recommended accommodations plus read-aloud accommodations) and the other without accommodations. The part of the test that was accommodated was determined by random assignment. The predictions were tested with a repeated measures ANOVA and effect size calculations. The results showed that teachers selected accommodations they considered valid and fair for use on a standardized test. They did not recommend using a read-aloud accommodation, as this accommodation would interfere with the purpose of the test (i.e., to measure reading ability) and thus would invalidate resulting test scores. Next, the accommodations that teachers recommended did not significantly affect test scores for either group of students. However, the read-aloud accommodation, when used in addition to those recommended by the teacher, did positively and significantly affect test scores for both groups of students. There was no differential benefit from the read-aloud accommodation, indicating overall score boosts for both groups of students, rather than the boost only for students with disabilities that was predicted.

Interestingly, there was much individual variability in the accommodation effects. As indicated by effect size statistics, the accommodations positively affected the scores for half of all students with disabilities and 38% of all students without disabilities. Furthermore, neither group of students scored significantly higher when the test was read aloud to them as compared to the groups that received other accommodations. While the read-aloud method helped both groups compared to their own performance without accommodations, using the read-aloud method did not produce a significant effect when groups receiving the method were compared to those receiving only the teacher-recommended accommodations.
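A recurring check in this line of research is the "differential boost" question: does the accommodation help examinees with disabilities more than it helps examinees without disabilities? The sketch below expresses that comparison using hypothetical paired scores; it illustrates the reasoning only and is not the analysis the cited studies performed.

```python
# Minimal sketch of the "differential boost" comparison described above: contrast
# the average gain from an accommodation for examinees with disabilities against
# the gain for examinees without disabilities. Group labels and scores are
# hypothetical.
from statistics import mean

def mean_boost(accommodated, nonaccommodated):
    """Average per-examinee gain when the same examinees take both conditions."""
    return mean(a - n for a, n in zip(accommodated, nonaccommodated))

# Hypothetical paired scores (same examinees, both conditions).
with_disability    = {"accommodated": [48, 52, 50, 55, 47], "standard": [41, 45, 44, 50, 40]}
without_disability = {"accommodated": [61, 58, 65, 60, 63], "standard": [59, 57, 64, 58, 62]}

boost_wd = mean_boost(with_disability["accommodated"], with_disability["standard"])
boost_wo = mean_boost(without_disability["accommodated"], without_disability["standard"])

print(f"Boost for students with disabilities:    {boost_wd:.1f}")
print(f"Boost for students without disabilities: {boost_wo:.1f}")
# A markedly larger boost for the disability group is consistent with the change
# removing an access barrier; similar boosts for both groups suggest the change
# may be affecting the target skill itself.
```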


Finally, McKevitt and Elliott (in press) found that students and teachers had mixed feelings about the accommodations. Students were generally positive about their use, but they expressed some concern that the read-aloud accommodation was too difficult to follow. Likewise, teachers felt positive about the use of accommodations for students with disabilities but also were concerned about how accommodations would affect test score validity. Teachers reported they rely primarily on professional judgment when making accommodations decisions, rather than on their own empirical testing of accommodations effects. Therefore, it is important to ensure teachers are knowledgeable about the use and effects of testing accommodations.

In summary, the McKevitt and Elliott (in press) study contributed to the increasing evidence that accommodations may have either positive or negative effects for individual students with and without disabilities. It also lends support to the popular belief that reading aloud a reading test to students as an accommodation invalidates test scores. The lack of differential boost (i.e., the finding that both groups of students profited from a read-aloud accommodation) observed in the study is one piece of evidence that the read-aloud accommodation has an invalidating effect. But the lack of differential benefit alone may not be sufficient to conclude invalidity of scores resulting from the use of accommodations. In the case of the students receiving the teacher-recommended accommodations alone, a differential boost also was not observed, and scores did not improve significantly for either group. However, this evidence by itself does not show that those accommodations were invalid. The accommodations still may have served to remove a disability-related barrier for the student tested but may not have had a significant effect on scores. Thus, evidence to support the validity of accommodations needs to come from multiple sources, examining student factors, test factors, and the accommodations themselves.

Clinical Assessment Research

Previous versions of the Stanford-Binet were popular with clinicians for assessing the intelligence of clients with disabilities (e.g., Braen & Masling, 1959). Hayes (1950) recommended a set of adaptations for clients with visual impairments, which became known as the Perkins-Binet (see Genshaft & Ward, 1982, and Ward & Genshaft, 1983, for reviews). Sattler (1972a, 1972b) conducted research with the Stanford-Binet with children who had mental retardation, cerebral palsy, speech difficulties, and no known disabilities. He concluded that the Stanford-Binet was useful for assessing the intelligence of these individuals. Sattler also recommended specific alterations in procedures and materials (i.e., accommodations and modifications) for low-performing and young children, but he noted that such modifications were inappropriate for higher-performing children. One outcome of this research (reviewed by Harrington, 1979) was to recommend a nonverbal set of tests selected from the Stanford-Binet and Wechsler Scales for low-performing children with disabilities. Katz (1956, 1958) recommended examiners use a modified pointing procedure, in which clients were allowed to direct their movements through commands and gestures, as an accommodation for testing. Finally, Bloom, Klee, and Raskin (1977) investigated the impact of abbreviated and complete versions of the Stanford-Binet with


children with developmental delays, and concluded that abbreviated forms, although correlating well with complete forms, missed essential features of performance that could lead to misclassification. Such findings probably influenced the design of the SB5 (Roid, 2003a), in which each of the five factors measured by the instrument incorporates both a verbal and a nonverbal subscale, resulting in the nonverbal index providing a comprehensive, composite measure of ability.

Unfortunately, most of this research is descriptive, with little effort to validate assessment accommodations and modifications through experimental means. Exceptions to this conclusion include work by Sattler (1972a, 1972b) and Handy (1996). Sattler included multiple measures of intelligence, and Handy systematically varied pantomimed and standardized administration of the Stanford-Binet Intelligence Scale: Fourth Edition (Thorndike, Hagen, & Sattler, 1986) and the Wechsler Scales among both students with mild hearing impairments and those with no impairments. Both these researchers concluded that accommodations made small differences in test outcomes, although where differences occurred, scores were higher in accommodated versus nonaccommodated conditions.

More commonly, researchers have conducted a logical or rational analysis and simply made recommendations. Occasionally, recommendations are supplemented by reporting whether changes in test conditions affect test scores. These procedures are inadequate for validating accommodations, as the effects of such changes on nondisabled populations are not known. Also, the literature on assessment accommodations in intellectual assessment uniformly ignores the need to retain construct representation. This is problematic, as many recommended accommodations and practices involve eliminating items or subtests from clinical batteries (e.g., Brauer, Braden, Pollard, & Hardy-Braz, 1997). The lack of attention to construct representation has been exacerbated by poor specification of cognitive abilities and the assumption that general intellectual abilities are adequately represented by virtually any portion of an intelligence test battery (McGrew, Keith, Flanagan, & Vanderwood, 1997). However, as test publishers and clinicians increasingly understand and apply cognitive frameworks, such as the CHC theory that drives the SB5, it will be possible to more accurately specify the constructs tests intend to measure and ensure accommodations adequately retain construct representation (Braden, in press).

Identifying Appropriate Accommodations on the SB5

Making decisions about appropriate accommodations requires professional judgments. These judgments are influenced directly by knowledge of the client, recognition of general access skills for most tests, knowledge of the content and demand characteristics of the test and respective subtests, and a well-grounded understanding of test score validity. In some cases, it is only as a result of testing a client that an examiner becomes aware of key factors that influence accommodation decisions. However, with some planning and knowledge of the role that test accommodations can play in testing, assessment results can be more meaningful.
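The decision logic running through this bulletin can be summarized informally: a change to testing is an accommodation when it addresses an access skill and leaves the target skills untouched, and a modification when it alters a target skill. The sketch below expresses that check in code. The skill sets and the helper function are hypothetical illustrations, not SB5 specifications; Table 1 later in this bulletin gives the actual subtest-level guidance.

```python
# Informal sketch of the accommodation-versus-modification check described in
# this bulletin. The skill lists are hypothetical examples, not the SB5's actual
# specifications.
from dataclasses import dataclass, field

@dataclass
class SubtestProfile:
    name: str
    target_skills: set = field(default_factory=set)   # what the subtest intends to measure
    access_skills: set = field(default_factory=set)   # skills merely needed to reach the items

def classify_change(profile: SubtestProfile, skills_affected: set) -> str:
    """Classify a proposed testing change by the skills it alters."""
    if skills_affected & profile.target_skills:
        return "modification (alters target skills; likely invalidates scores)"
    if skills_affected & profile.access_skills:
        return "accommodation (addresses access skills; preserves construct representation)"
    return "no apparent effect on this subtest"

# Hypothetical example profile.
matrices = SubtestProfile(
    name="Object Series/Matrices",
    target_skills={"fluid reasoning", "pattern recognition"},
    access_skills={"visual acuity", "fine motor response", "sustained attention"},
)

print(classify_change(matrices, {"fine motor response"}))   # e.g., allowing a vocal response
print(classify_change(matrices, {"fluid reasoning"}))       # e.g., cuing or feedback on errors
```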


Know the Client

Examiners should become familiar with the client's physical, emotional, and cognitive abilities and disabilities prior to testing. The SB5 is designed to measure a wide range of cognitive abilities but not physical or emotional abilities. Of course, these domains of abilities are correlated to some degree. Therefore, it is quite challenging for an examiner to select appropriate accommodations. This challenge is reduced, however, when the examiner observes the client in his or her natural setting (school, work, etc.) and interacts with the client before testing to get a clear understanding of his or her motor and sensory skills and ability to regulate emotions. Examiners should also interview others who know the client well to gain a good understanding of client attributes and essential skills needed to meaningfully access and respond to test items.

Recognize Key Access Skills

In theory, there are hundreds of potential skills needed to access and respond to an item on a test like the SB5. Based on a comprehensive understanding of the constructs targeted by the SB5, there are a number of general access skills that clients need to facilitate accurate measurement of their cognitive abilities. These can include:

■ Attending
■ Listening to and understanding language
■ Seeing
■ Sitting still for an extended period of time
■ Reading
■ Writing
■ Following directions
■ Manipulating materials
■ Tracking the examiner's movements and related materials
■ Processing information in a timely manner
■ Working for a sustained period of time
■ Communicating personal needs
■ Asking questions when they do not understand

Whether or not a skill is an access skill is determined by the purpose of the test. In other words, some skills may be access skills for one subtest but not for another subtest, because the latter subtest includes the skill or construct among the constructs it intends to assess (i.e., target skills).

Know the Test Content and Administration Procedures

There is no substitution for knowing the content of a test when making decisions about appropriate accommodations. If examiners have a comprehensive understanding of the knowledge and skills targeted by the test, they can deduce many of the necessary skills needed to access and respond to the test content.


Familiarity with the test and the related underlying cognitive abilities measured by the test provides the foundation for making professional judgments about appropriate accommodations for any client. An initial analysis of the cognitive abilities targeted by SB5 scales and subtests is included in this bulletin, along with suggestions for appropriate accommodations. Inappropriate modifications to the subtests are also identified.

Built-In Accommodations

By incorporating adaptive testing, extra time, and the availability of verbal vs. nonverbal scales, many concerns regarding accommodation have already been addressed and built into the administration procedures of the SB5. In adaptive testing, the examiner gives items that are relevant to the particular examinee and avoids items irrelevant to estimating the examinee's ability level (i.e., items that are too difficult or too easy). The SB5 incorporates adaptive testing by using routing tests so that subsequent testing is adapted to the examinee's ability level.

Extra time is perhaps the most common assessment accommodation (Elliott, Braden, & White, 2001). Fortunately, few SB5 subtests have any time limits, and most time limits are merely to avoid examinee frustration and fatigue. The examiner can generally ignore those time limits when judgment indicates that the benefits to the examinee exceed the costs of fatigue or lost attention.

Finally, the availability of language-loaded (verbal) and language-reduced (nonverbal) scales that comprise all five of the cognitive factors assessed on the SB5 allows examiners to adapt the assessment to different language abilities without significantly sacrificing construct representation. For example, most other cognitive ability batteries use exclusively verbal subtests to measure Knowledge and exclusively nonverbal subtests to measure Fluid Reasoning. On these batteries, examiners must decide whether to omit these tests for some clients (e.g., clients who are deaf, hard of hearing, or visually impaired) and sacrifice construct representation, or whether to include them and consequently threaten results due to construct-irrelevant variance. The SB5 provides examiners with the means to use either a language-reduced or language-loaded approach to testing without sacrificing construct representation.
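The routing idea described at the start of this subsection can be pictured schematically. In the sketch below, a short routing score selects the starting level for later items; the levels, cutoffs, and function names are hypothetical and do not reproduce the SB5's actual routing rules, which are set out in the test's own administration materials.

```python
# Schematic sketch of routing-based adaptive testing: a short routing subtest
# estimates ability, and that estimate selects the starting level of the
# remaining subtests. Levels and cutoffs are hypothetical.
def route_to_start_level(routing_raw_score: int) -> int:
    """Map a routing-subtest raw score to a starting testlet level (1 = easiest)."""
    cutoffs = [(4, 1), (9, 2), (14, 3), (19, 4)]  # (max raw score, level), hypothetical
    for max_score, level in cutoffs:
        if routing_raw_score <= max_score:
            return level
    return 5  # highest level

def administer(routing_raw_score: int, item_bank: dict) -> list:
    """Return the items an examinee would actually see, given the routing result."""
    start = route_to_start_level(routing_raw_score)
    # Items below the routed level are skipped (too easy), which is the sense in
    # which adaptive testing avoids items irrelevant to estimating ability.
    return [item for level, items in sorted(item_bank.items()) if level >= start for item in items]

# Hypothetical item bank: level -> item identifiers.
bank = {1: ["a1", "a2"], 2: ["b1", "b2"], 3: ["c1", "c2"], 4: ["d1", "d2"], 5: ["e1", "e2"]}
print(administer(routing_raw_score=7, item_bank=bank))   # starts at level 2
print(administer(routing_raw_score=17, item_bank=bank))  # starts at level 4
```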
Scale and Subtest-Specific Information to Guide Accommodations

This section provides specific suggestions for accommodations on the SB5. It is assumed that examiners will address noncognitive access skills in SB5 administration. For example, examiners routinely provide wheelchair-accessible furniture, allow communication devices (e.g., speech synthesizers, keyboards) that examinees use to express themselves, provide adequate breaks or rest to reduce physical fatigue, and so forth. However, it is not possible to anticipate all of the physical accommodations that examinees might require, and examiners are encouraged to consider carefully whether a change in the test setting, timing, presentation, or response characteristics affects the target skills (construct representation) of the examination. If it does not, examiners should be flexible, practical, and welcoming as they consider assessment accommodations, while critically evaluating potential accommodations to ensure that they do not invalidate the assessment results.


Table 1
Nonverbal Subtest/Activity Target Skills, Appropriate Accommodations, and Inappropriate Modifications

Object Series/Matrices
Target Skills: Fluid reasoning, induction, general sequential reasoning, visual memory, visualization of abstract stimuli, attention to visual cues, concentration for long periods, systematic visual scanning, search strategies, mental review of potential answers, visual discrimination, tracking visual sequences, pattern recognition, and mental verbal mediation
Appropriate Accommodations: Allow extra time, allow vocal response in lieu of motor response or vice versa to indicate selection, allow any modality for presenting directions (signing, writing, speaking)
Inappropriate Modifications: Allow additional or nonstandard cuing or feedback on errors (except on starting items as indicated), reduce number of alternatives for any item, create manipulative materials for matrix items

Procedural Knowledge
Target Skills: Crystallized abilities/knowledge, general information, oral production and fluency, visualization of meaningful stimuli, freedom from visual neglect, toleration of ambiguity, search strategies, and visual discrimination
Appropriate Accommodations: Allow extra time, allow vocal response in lieu of motor response to indicate selection, allow any modality for presenting directions (signing, writing, speaking), allow tactile contact with materials, allow examinee to "guide" examiner to respond (but do not suggest alternatives)
Inappropriate Modifications: Allow additional or nonstandard cuing or feedback on errors (except on starting items as indicated), offer examinee a finite set of alternatives (e.g., "This way or that way?")

Picture Absurdities
Target Skills: Crystallized abilities/knowledge, general information, oral production and fluency, visualization of meaningful stimuli, freedom from visual neglect, toleration of ambiguity, search strategies, and visual discrimination
Appropriate Accommodations: Allow extra time, allow motor/gestured or vocal response (i.e., do not require both gestures and vocalization as long as the examinee provides unambiguous, scoreable responses), allow any modality for presenting directions (signing, writing, speaking) that does not describe the problem depicted in the item
Inappropriate Modifications: Allow additional or nonstandard cuing or feedback on errors (except on starting items as indicated), use nonstandard query (e.g., "Yes, that's silly, but there's something even sillier.")

Nonverbal Quantitative Reasoning
Target Skills: Quantitative reasoning, mathematical knowledge, concentration for long periods, and production of conventional answers
Appropriate Accommodations: Allow extra time, allow vocal response in lieu of motor response or vice versa to indicate selection, allow any modality for presenting directions (signing, writing, speaking) but do not use gestures that clearly indicate the correct answer (e.g., "bigger" must be represented symbolically via voice, print, or sign), allow examinee to vocalize or "guide" examiner to respond (but do not suggest alternatives), provide scratch paper to assist in mental calculations
Inappropriate Modifications: Allow additional or nonstandard cuing or feedback on errors (except on starting items as indicated), reduce number of alternatives for any item, provide a calculator for calculations

Form Board
Target Skills: Visual-spatial processing, spatial relations, visualization of meaningful stimuli, inspection of objects by touch, freedom from visual neglect, precision of movement, tracking visual sequences, and retention span
Appropriate Accommodations: Allow extra time, allow vocal response in lieu of motor response or vice versa to indicate selection, allow any modality for presenting directions (signing, writing, speaking), allow examinee to "guide" examiner to respond (but do not suggest alternatives)
Inappropriate Modifications: Allow additional or nonstandard cuing or feedback on errors (except on starting items as indicated), reduce the number of item pieces (i.e., presentation in sequence rather than all at once)


Table 1 (Continued)
Nonverbal Subtest/Activity Target Skills, Appropriate Accommodations, and Inappropriate Modifications

Form Patterns
Target Skills: Visual-spatial processing, spatial relations, closure speed, visualization of meaningful stimuli, inspection of objects by touch, freedom from visual neglect, precision of movement, tracking visual sequences, and retention span
Appropriate Accommodations: Allow extra time to execute response, adjust precision rules for scoring if examiner concludes examinee's motor skills (not visual-spatial processing) caused poor placement of pieces, allow examinee to vocalize or "guide" examiner to respond (but do not suggest alternatives)
Inappropriate Modifications: Allow extra time to consider or plan response, provide options and ask examinee to select "best" response, allow drawing response

Delayed Response
Target Skills: Working memory, memory span, impulse control, freedom from distractibility, patience with difficult tasks, speed of movement, precision of movement, tracking visual sequences, and retention span
Appropriate Accommodations: Allow extra time, allow vocal response in lieu of motor response or vice versa to indicate selection, allow any modality for presenting directions (signing, writing, speaking)
Inappropriate Modifications: Allow additional or nonstandard cuing or feedback on errors (except on starting items as indicated)

Block Span
Target Skills: Working memory, memory span, visual memory, serial perception, impulse control, freedom from distractibility, patience with difficult tasks, speed of movement, precision of movement, tracking visual sequences, and retention span
Appropriate Accommodations: Allow extra time, allow vocal response in lieu of motor response or vice versa to indicate selection, allow any modality for presenting directions (signing, writing, speaking), adjust precision rules for tapping if examiner concludes examinee's motor skills (not working memory) caused a misplaced strike, allow examinee to vocalize or "guide" examiner to respond (but do not suggest alternatives)
Inappropriate Modifications: Allow additional or nonstandard cuing or feedback on errors (except on starting items as indicated), repeat items, slow or speed up presentation, vocalize block numbers or provide other auditory or linguistic cues

Note. Italicized target skills are drawn from Stratum I narrow cognitive abilities listed in Table 1.3 of the SB5 Interpretive Manual (Roid, 2003c). Unitalicized target skills are drawn from cognitive abilities (processes) listed in Table 1.5 of the same manual.

Table 2
Verbal Subtest/Activity Target Skills, Appropriate Accommodations, and Inappropriate Modifications

Early Reasoning
Target Skills: Fluid reasoning, general sequential reasoning, oral production and fluency, visual memory, attention to verbal cues, mental review of potential answers, verbal fluency, rapid retrieval of words and explanations, and production of creative answers
Appropriate Accommodations: Allow signed, oral, and written presentation of directions and examinee responses; Level 2: Accept gestured responses that clearly explain/extend (rather than merely represent) the picture; Level 3: Allow examinee to identify groups of three using words or pointing if accompanied by verbal or abstract (not necessarily vocal) description of category
Inappropriate Modifications: Allow additional or nonstandard cuing or feedback (except as indicated); Level 2: Accept a simple representation (e.g., drawing, gestured tableau) of stimulus (rather than explanation or extension); Level 3: Accept clusters without explicitly naming or indicating the characteristic that members share within groups

Verbal Absurdities
Target Skills: Fluid reasoning, induction, attention to verbal cues, mental review of potential answers, verbal fluency, rapid retrieval of words and explanations, and production of creative answers
Appropriate Accommodations: Allow signed, oral, and written presentation of directions and examinee responses; allow gestured query by the examiner to the examinee (e.g., folds hands, brings hands to body to encourage an elaborated response)
Inappropriate Modifications: Gesture to present items, represent items as pictures, provide finite responses and invite examinee to select response


Table 2 (Continued)
Verbal Subtest/Activity Target Skills, Appropriate Accommodations, and Inappropriate Modifications

Verbal Analogies
Target Skills: Fluid reasoning, attention to verbal cues, mental review of potential answers, verbal fluency, rapid retrieval of words and explanations, and production of creative answers
Appropriate Accommodations: Allow signed directions, items, and responses
Inappropriate Modifications: Gesture to present items, represent items as pictures, provide finite responses and invite examinee to select response

Vocabulary
Target Skills: Crystallized abilities/knowledge, lexical knowledge, language development, fund of general information, and rapid retrieval of words and explanations
Appropriate Accommodations: Allow signed or written directions, items, and responses; allow gestured responses that explain or extend picture items
Inappropriate Modifications: Allow gestured responses that simply mimic or copy correct answers (e.g., "Put your finger on your nose."), provide synonyms or act out the meaning of words, translate words to more or less common alternatives (e.g., through signing), provide finite responses and invite examinee to select response

Verbal Quantitative Reasoning
Target Skills: Quantitative reasoning, mathematical knowledge, concentration for long periods, and production of conventional answers
Appropriate Accommodations: Allow extra time, allow any modality for presenting directions and for responding (signing, writing, speaking), allow examinee to tap (the number of objects), gesture (hold up fingers, use hash marks), or write answers to counting items, provide scratch paper to assist in mental calculations
Inappropriate Modifications: Allow examinee to use calculator for arithmetic items, create visual cues to reduce mathematical complexity (e.g., drawing out story problems, writing down key facts, providing partial solution), provide scratch paper for visual items unless otherwise allowed

Position and Direction
Target Skills: Visual-spatial processing, visualization, attention to verbal cues, and recognition and evaluation of parts
Appropriate Accommodations: Allow signed or written directions, items, and responses; use simplified directions that emphasize key phrases or words (e.g., on, bottom); allow vocal or pointing response; allow examinee to "guide" examiner to respond (but do not suggest alternatives)
Inappropriate Modifications: Allow gestured responses that simply mimic or copy correct answers (e.g., "Put the ball on the table."), provide synonyms, act out the meaning of words, use gestured presentations that indicate the correct placement or response, accept pointing response to indicate routes on a map in lieu of directions (i.e., answers must be given from the perspective of the people in the map, not the map reader), provide pencil and paper to allow trial-and-error responses to the last two items

Memory for Sentences
Target Skills: Working memory, memory span, language development, impulse control, freedom from distractibility, patience with difficult tasks, wide auditory attention span, and retention span
Appropriate Accommodations: Cue examinee that item is about to be presented; allow time to respond; allow responses in writing or signs (only after the item is presented); present items in signs while using a single, unique sign for each word in the sentence
Inappropriate Modifications: Present items in writing, repeat items, segment items into phrases, suggest or model retention strategies, allow examinee to write or record items during item presentation

Last Word
Target Skills: Working memory, memory span, language development, impulse control, freedom from distractibility, patience with difficult tasks, wide auditory attention span, and retention span
Appropriate Accommodations: Cue examinee that item is about to be presented; allow time to respond; allow responses in writing or signs (only after the item is presented); present items in signs while using a single, unique sign for the last word in each question
Inappropriate Modifications: Present items in writing, repeat questions, emphasize the last word in each question, suggest or model retention strategies, advise examinees to ignore the question, allow examinee to write or record items during item presentation

Note. Italicized target skills are drawn from Stratum I narrow cognitive abilities listed in Table 1.3 of the SB5 Interpretive Manual (Roid, 2003c). Unitalicized target skills are drawn from cognitive abilities (processes) listed in Table 1.5 of the same manual.


References

American Educational Research Association, American Psychological Association, & National Council on Measurement in Education (AERA, APA, & NCME). (1999). Standards for educational and psychological testing (3rd ed.). Washington, DC: American Educational Research Association.

American Psychological Association. (2002). Ethical principles of psychologists and code of conduct. American Psychologist, 57, 1060–1073.

Bloom, A. S., Klee, S. H., & Raskin, L. M. (1977). A comparison of the Stanford-Binet abbreviated and complete forms for developmentally disabled children. Journal of Clinical Psychology, 33(2), 477–480.

Braden, J. P. (in press). Accommodating clients with disabilities on the WAIS-III/WMS. In D. Saklofske & D. Tulsky (Eds.), Use of the WAIS-III/WMS in clinical practice (pp. 451–486). New York: Houghton-Mifflin.

Braen, B. B., & Masling, J. M. (1959). Intelligence tests used with special groups of children. Exceptional Children, 26, 42–45.

Brauer, B. A., Braden, J. P., Pollard, R. Q., & Hardy-Braz, S. T. (1997). Deaf and hard of hearing people. In J. Sandoval, C. L. Frisby, K. F. Geisinger, J. D. Scheuneman, & J. R. Grenier (Eds.), Test interpretation and diversity: Achieving equity in assessment (pp. 297–315). Washington, DC: American Psychological Association.

Carroll, J. B. (1993). Human cognitive abilities: A survey of factor-analytic studies. Cambridge, England: Cambridge University Press.

CTB/McGraw-Hill. (1994–2000). TerraNova CTBS Multiple Assessments. Monterey, CA: Author.

CTB/McGraw-Hill. (1999). CTBS Multiple Assessments Reading Battery–Research Version (Form A). Monterey, CA: Author.

DiPerna, J. C., & Elliott, S. N. (2000). Academic Competence Evaluation Scales (ACES). San Antonio: The Psychological Corporation.

Elliott, S. N., Braden, J. P., & White, J. L. (2001). Assessing one and all: Educational accountability for students with disabilities. Reston, VA: Council for Exceptional Children.

Elliott, S. N., Kratochwill, T. R., & McKevitt, B. C. (2001). Experimental analysis of the effects of testing accommodations on the scores of students with and without disabilities. Journal of School Psychology, 39(1), 3–24.

Elliott, S. N., Kratochwill, T. R., & Schulte, A. G. (1999). Assessment accommodations checklist. Monterey, CA: CTB/McGraw-Hill.

Elliott, S. N., & Marquart, A. M. (in press). Extended time as a testing accommodation: Its effects and perceived consequences. Exceptional Children.

Genshaft, J., & Ward, M. E. (1982). A review of the Perkins-Binet Tests of Intelligence for the Blind with suggestions for administration. School Psychology Review, 11(3), 338–341.

Handy, L. A. (1996). Pantomime administration of the WISC-III and SB:FE to hearing and otitis prone Native Indian students. Dissertation Abstracts International Section A: Humanities and Social Sciences, 56(8-A), 3058.

Harrington, R. G. (1979). A review of Sattler's modifications of standard intelligence tests for use with handicapped children. School Psychology Digest, 8(3), 296–302.

Hayes, S. P. (1950). Measuring the intelligence of the blind. In P. A. Zahl (Ed.), Blindness: Modern approaches to the unseen environment (pp. 141–173). Oxford, UK: Princeton University Press.


Katz, E. (1956). The pointing scale method: A modification of the Stanford-Binet procedure for use with cerebral palsied children. American Journal of Mental Deficiency, 60, 838–842.

Katz, E. (1958). The "Pointing Modification" of the revised Stanford-Binet Intelligence Scales, forms L and M, years II through VI: A report of research in progress. American Journal of Mental Deficiency, 62, 698–707.

McDonnell, L. M., McLaughlin, M. J., & Morison, P. (Eds.). (1997). Educating one and all: Students with disabilities and standards-based reform. Washington, DC: National Academy Press.

McGrew, K. S., Keith, T. Z., Flanagan, D. P., & Vanderwood, M. (1997). Beyond "g": The impact of "Gf-Gc" specific cognitive abilities research on the future use and interpretation of intelligence test batteries in the schools. School Psychology Review, 26(2), 189–210.

McKevitt, B. C., & Elliott, S. N. (in press). The effects and consequences of using testing accommodations on a standardized reading test. School Psychology Review.

Messick, S. (1995). Validity of psychological assessment: Validation of inferences from persons' responses and performances as scientific inquiry into score meaning. American Psychologist, 50(9), 741–749.

National Association of School Psychologists. (2000). Professional conduct manual. Principles for professional ethics: Guidelines for the provision of school psychological services. Retrieved July 1, 2003, from http://ericcass.uncg.edu/virtuallib/assess/1023.pdf

National Council of Teachers of Mathematics. (1989). Curriculum and evaluation standards for school mathematics. Reston, VA: Author.

Office of Civil Rights. (2000). The use of tests as part of high-stakes decision-making for students: A resource guide for educators and policy makers. Washington, DC: U.S. Department of Education. (Available at http://www.ed.gov/offices/OCR/testing/index1.html)

Phillips, S. E. (1993). Testing condition accommodations for disabled students. West's Education Law Quarterly, 2(2), 366–389.

Phillips, S. E. (1994). High stakes testing accommodations: Validity versus disabled rights. Applied Measurement in Education, 7(2), 93–120.

Roid, G. H. (2003a). Stanford-Binet Intelligence Scales, Fifth Edition. Itasca, IL: Riverside Publishing.

Roid, G. H. (2003b). Stanford-Binet Intelligence Scales, Fifth Edition, examiner's manual. Itasca, IL: Riverside Publishing.

Roid, G. H. (2003c). Stanford-Binet Intelligence Scales, Fifth Edition, interpretive manual: Expanded guide to the interpretation of SB5 test results. Itasca, IL: Riverside Publishing.

Roid, G. H. (2003d). Stanford-Binet Intelligence Scales, Fifth Edition, technical manual. Itasca, IL: Riverside Publishing.

Sattler, J. M. (1972a). Intelligence test modifications on handicapped and nonhandicapped children: Final report (ERIC Document Reproduction Service No. 095673). San Diego, CA: San Diego State University Foundation.

Sattler, J. M. (1972b). Intelligence test modifications on handicapped and nonhandicapped children: Supplement to final report (ERIC Document Reproduction Service No. 095674). San Diego, CA: San Diego State University Foundation.

Schrank, F. A., & Flanagan, D. P. (Eds.). (2003). WJ III clinical use and interpretation: Scientist-practitioner perspectives. San Diego, CA: Academic Press.

Schulte, A. G., Elliott, S. N., & Kratochwill, T. R. (2001). Experimental analysis of the effects of testing accommodations on students' standardized achievement test scores. School Psychology Review, 30(4), 527–547.


Thorndike, R. L., Hagen, E. P., & Sattler, J. M. (1986). The Stanford-Binet Intelligence Scale: Fourth Edition. Itasca, IL: Riverside Publishing.

Thurlow, M. L., Elliott, J. L., & Ysseldyke, J. E. (1998). Testing students with disabilities: Practical strategies for complying with district and state requirements. Thousand Oaks, CA: Corwin Press.

Tindal, G. (1998). Models for understanding task comparability in accommodated testing. Eugene, OR: Council of Chief State School Officers State Collaborative on Assessment and Student Standards Assessment Special Education Students (ASES) Study Group III, University of Oregon.

Ward, M. E., & Genshaft, J. (1983). The Perkins-Binet Tests: A critique and recommendations for administration. Exceptional Children, 49(5), 450–452.
