Friday, August 29, 2014

TKES/TAPS Observation: DOK, What to say

There is a fair amount of discussion about depth of knowledge (DOK) right now.  Some teachers think their curriculum is a mile wide and an inch deep, yet they are still expected to ensure their students have a depth of knowledge that is substantial and verifiable.  As part of my new TKES/TAPS lesson plan design (see: Lesson Plan), I included a Depth of Knowledge section that covers all 4 DOK levels.  My earlier iterations of the lesson plan had words only, but then it was suggested that we have actual sentences we might use during class.  To that end, I changed my DOK section, and I recommend that you include elements like these in your lesson plan, along with statements you can use when the observer is in your room.  I keep a list at my podium where I can cite these, in some form or another, as needed.  As has been mentioned in other posts, my evaluator has indicated on more than one occasion that, "if it isn't in the lesson plan or isn't seen, it doesn't exist."  I intend to make sure it is in the lesson plan and it is observed.

Level 1: 
How do you define this?
Can you identify which one is the ___?
Tell me how you know ___.  (One of my favorites.)
Name all of the ___.
Recite all of the ___.

Level 2:
Identify patterns in the ___.
Compare these elements.
How do you interpret that?
What observations can you make about that?
Show me that you know it.  (One of my favorites.)
Summarize those ideas for me.
How do you relate that to what we did before?  (One of my favorites.)
What cause and effect do you see?
Estimate that for yourself.

Level 3:
Compare and contrast these elements.
What conclusions can you draw from that?
Critique this for me.  (One of my favorites.)
Cite evidence for your decision on that.
Construct and support your answer.
How would you differentiate between these?
Hypothesize: what would happen if ___?  (I like this one too.)
Formulate ___.

Level 4:
Analyze and synthesize information from multiple sources.  
Critique that for me.  (One of my favorites.)
Apply concepts to illuminate a problem or situation.
Connect that to something you learned the other day.  (One of my favorites.)
Create ___.
Prove ___.

Rationale: The teacher will ask students to use multiple levels of DOK during instruction, the work session, and the summary.

Respectfully,
Glenn

Classroom Management

We received a handout recently (yet another one) that is to serve as a "reflective instrument" to help teachers gauge their overall classroom management practice.  The list has 10 items, each answered Yes or No; the Yes answers are then totaled and compared to an Overall Classroom Management score.  It is a paraphrase of a revision of Sugai & Colvin.  I hope it helps you reflect on your classroom environment, your interactions, and your overall classroom management.

  1. Classroom arrangement minimizes crowding and distraction.
  2. My classroom has maximized structure and predictability (explicit routines, procedures, directions, etc.)
  3. I have taught, reviewed, and posted at least 3 positively stated expectations/rules.
  4. I provide more frequent feedback for appropriate behavior than for inappropriate behavior.
  5. Each student has multiple opportunities to respond and participate during instruction.
  6. My instruction actively engages students in observable ways (writing, telling, etc.)
  7. I actively monitor and supervise my classroom during instruction (moving, observing, etc.)
  8. For students with inappropriate behavior, I ignore or provide quick, specific redirections or corrections.
  9. I acknowledge appropriate behaviors through multiple strategies or systems (point system, praise, etc.)
  10. I provide specific feedback and correct social/behavior errors.
Score: 8-10 = Super; 5-7 = So-So; fewer than 5 = Needs Improvement

I scored 9 out of 10; I would imagine most teachers rate 8 or higher.  In my mind, I missed #3.  I have taught and reviewed my expectations (students go directly to their materials, directly to their chairs, read the lesson goals/instructions on the SmartBoard, and immediately prepare for class), but I have not posted them.  I should post them, however....

Respectfully,
Glenn

Saturday, August 23, 2014

Pre-Evaluation: Polite yet official

As I work through this TKES/TAPS process to reach my TEM score through the SLO assessments and SUI (Surveys of Instructional Practice), it is somewhat comforting to know that my score impacts the LKES evaluation.  Translated: my annual evaluation impacts my administrators' and school's evaluations; the better my score, the better their score.  It is in their best interest that I do well.

During my pre-evaluation conference, I got the sense that the administrators in charge were genuinely concerned that I pass; that was a comforting thought.  For the first time, I felt like someone was working with and for me as opposed to against me.  My coordinator said that she would e-mail my evaluator and upload the information into the GaDOE TLE website - aka, The Platform.  After she uploaded her account of my pre-evaluation, e-mailed my evaluator, and he replied, I received a notice that I needed to sign off on the pre-evaluation conference.

I went into The Platform and reviewed the section.  In the conference, I expressed things like, "I'm not sure how you would interpret this, but in my setting, this is what I do....," "The interpretation of the word 'orderly' in my room looks like....." She said things like, "You could put a copy in your lesson plan binder" and "You differentiate all the time by the very nature of your performance tasks."

However, in The Platform, I noticed that there were words like, "Dr. Cason had some concerns about data....," "I advised that this data....," and "I advised him to keep a copy in his lesson plan binder....," "We discussed that 'orderly' does not mean 'quiet'," and "We discussed students as self-directed learners and what that looks like...." [emphasis added].  I've been around long enough to know that the warm, fuzzy-discussion-turned-cold-worded advisement is a clue.  They're watching their back; I better watch mine.  I hope to help you too....

The point: while the pre-evaluation conference was informative and polite, the documentation loaded into The Platform was very official language.

I noticed that at the end, you are supposed to sign off and agree with what was posted.  Below the button was something to the effect that once you agreed to it, you would not be allowed to edit it any more.  I didn't sign off on it right away.  I was concerned about the wording, so I created a reply in a document - a long, 4-paragraph reply addressing each area noted in The Platform.  I copied and pasted the reply into the form, then clicked accept or agree or submit.....

TLE Electronic Platform

This information is quoted from a document from the GaDOE TLE Electronic Platform dated August 5, 104 [sic] entitled, "Just for Teachers!  August"  All Rights Reserved.
  1. Why do I need to upload documentation for every standard?  Documentation is not required for every standard.  Evaluators may request documentation from a teacher when a standard is not observed during a formative observation, walkthrough, or when the consistency of a teacher's practice cannot be established [emphasis added].  The teacher is responsible for submitting requested documentation in a timely manner.
  2. Why will my TEM (Teacher Effectiveness Measure) be based on last year's student growth?  Student Growth (Student Growth Percentiles) and/or Student Learning Objectives data are finalized during the summer of each year, and after the completion of the school year and the evaluation cycle.  Therefore, student growth is lagging data and is used in the following school year's evaluation cycle and will contribute at least 50% towards the Teacher Effectiveness Measure (TEM).  [emphasis added]
  3. Do the student surveys count a percentage toward my TEM?  No, these results do not count a percentage toward a TEM.  Results of the Surveys of Instructional Practice (student surveys) are used as a source of documentation to assist evaluators in rating Performance Standards 3, 4, 7, and 8 on the Summative Assessment [emphasis added].
  4. How much training did my administrator receive to evaluate teachers?  All administrators who perform evaluations on teachers must participate in a GaDOE approved TKES training and pass a credentialing assessment at the end of the training.
  5. No one in my school receives Level IV ratings.  Are they impossible to achieve?  No, ratings of Level IV are not impossible to achieve.  Teachers who perform at Level III and go beyond could certainly earn Level IV ratings.  In addition to going above and beyond a Level III rating, teachers must be a role model or teacher leader [emphasis added].

Wednesday, August 20, 2014

Pre-Evaluation Conference

Today I had my pre-evaluation conference.  I had the opportunity to have one in a group meeting or one-on-one with my Evaluation and Assessment Coordinator (EAC, a new job position for our county in charge of organizing the TKES process - and probably the new standardized tests).  I requested a one-on-one because I heard in the group meeting that the administrators/evaluators had actually reviewed our TKES/TAPS self-assessments and would use that information in working with us--that caught my attention, and I decided I wanted to know how my self-assessment would impact me.

After our conference, I got every impression that the self-assessment would serve as a list of "strengths" and "weaknesses" for them to review as part of their observations.  In addition, I got every impression that the observers would be very careful to notice each and every aspect of the 72 TAPS (Teacher Assessment on Performance Standards) indicators - even though they are not a checklist!  (so they say....)  As we were discussing the impending initial observations, and whether lesson plans had to be printed and kept in a notebook or could be available on the SmartBoard when requested, it became evident that I should show my EAC what my lesson plans looked like.  She had indicated that "if it isn't in the lesson plan and isn't seen, then the observer would have to ask for further documentation" (I think there is a 24-hour window to provide documentation).  She reviewed my lesson plan for today, I showed her the different elements, and it seemed like everything the observer would need to see was in there.  When she had a question like, "Where is the differentiation ____?" I showed her where it was, and she seemed satisfied.

I've added about 5 more documents to my notebook (today), will add the summative assessment results from tomorrow to show "data to support differentiation," and will add a couple more articles from research organizations re: 1) Does requiring more math classes each day help kids who struggle with math?  (Answer: No.)  2) Has requiring more math and science classes in high school increased America's rank in math/science and preparation for the high-tech job market?  (Answer: No.)

Summary: take this very seriously; document everything; put a number on the back of each paper indicating which indicator(s) it will satisfy (e.g., 9.5); and create or use a comprehensive lesson plan that includes each TAPS element you can put in there.  If you need suggestions, look here or e-mail me if you need one specifically for you....

Thanks for checking in.  If you have specific questions, please feel free to e-mail me at: gcason123@gmail.com and I'll get to you as soon as I can.

For those drowning in education acronyms, let me add this in what I believe is hierarchical form:

  • TKES: Teacher Keys Effectiveness System is the entirely new evaluation system passed by the Georgia legislature that will be used for all Georgia teachers of record.  It replaces GTOI....oops, sorry: the Georgia Teacher Observation Instrument  :-)
  • TAPS: Teacher Assessment on Performance Standards is the set of 10 Standards and 72 Elements/Indicators (not to be used as a checklist) that outline an effective teacher's actions.
  • EAC: Evaluation and Assessment Coordinator is a new position in our county, barely similar to the "Instructional Lead Teacher" (ILT) from years ago, that helps coordinate the TKES training, SLO administration, and coordination between the administrators, county, and teachers.  If you don't have one specifically, who is helping you out?
  • SLO: Student Learning Objectives is the set of concepts &/or skills (written &/or performed), written by your county teachers, given in a pre-test/post-test format to gauge the amount of student achievement under a specific teacher's direct instruction in non-standardized testing subjects (Connections, non-academic classes, non-CRCT classes, etc.)  The number and amount of student achievement gain will have a significant impact on a teacher's year end evaluation.
If you think I could be helpful to you, your teachers, or your administrators, please feel free to contact me.

Tuesday, August 19, 2014

53 Ways to Check for Understanding

I received this today from an Administrator.  It may be a bit of overkill, but if you need some ideas on how to really determine if your students understand what you are presenting, these are very creative and would serve many different age levels, grade levels, and subject matters.

Check out the file under the page: TKES 53 Ways to Check for Understanding
(https://sites.google.com/site/gcason123/tkes-blog)

Respectfully,
Glenn

This is important, but it won't matter...

If you have read the previous posts, you should notice that there are practical, real concerns in developing and administering the S.L.O.s (Student Learning Objectives) as part of the new Georgia TKES (Teacher Keys Effectiveness System).  If you haven't read them, please take a moment to do that now....  I'll wait....  Researchers have now compiled evidence that those in charge should take notice of (although it is too late for that).  The following statements are from a recent article released by the IES (Institute of Education Sciences); if you haven't signed up for their automatic e-mail notices, you should - it's good stuff - it's the real deal.  I have put a copy of this article in my TKES/TAPS notebook (a picture of my notebook can be seen in a previous post).

So, from: Gill, B., English, B., Furgeson, J., & McCullough, M. (2014).  Alternative student growth measures for teacher evaluation: Profiles of early-adoption districts.  (REL 2014-016).  Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Mid-Atlantic.  Retrieved from http://ies.ed.gov/ncee/edlabs:
SLOs can be used for teacher evaluation in any grade or subject, but require substantial effort by teachers and principals, and ensuring consistency is challenging [emphasis added].
Use of alternative growth measures that do not depend on state assessments is recent, and little is known about their validity and reliability [emphasis added] or about how they are being used. 
I have subtly raised concerns about the reliability of our SLOs.  It is assumed that the teacher will provide substantial performance task(s) (based on the end-of-year expectations) that will show student growth, but because of time limitations, lack of preparation, lack of knowledge, etc., it is conceivable that this may not happen - it is an unknown variable.  Thus, to me, the reliability of our SLOs is in question....

I have an excellent article, but it is in my TKES/TAPS notebook at school; I'll bring it home and give you the information soon.  The researchers show that SLOs really shouldn't be used in teacher evaluations at all....  It is from the same research institute.

If you have comments or questions, let me know....  gcason123@gmail.com; online portfolio is: https://sites.google.com/site/gcason123/

Friday, August 15, 2014

TKES: SLO, Day 2-Timing

Note: these comments are for the benefit of those who still have time to plan and/or make adjustments to their pre-tests.  I thought ours were going to be great, flawless, and easy...that has not been the case.

Legislators should take note of what they have required us to do.  The teachers who were involved in creating our SLO (Student Learning Objective) test as a part of the TKES (Teacher Keys Effectiveness System) evaluation did the best they could with the information they were given (I was one of those teachers).  In hindsight, we needed more and better information at the beginning, and more time to develop and test for a truly valid and reliable instrument.

For example: I think some teachers created a written portion of the SLO with over 100 questions, some with 35 questions, some with 65+; apparently some SLOs cover the entire curriculum, while others cover only the major parts (where the most time is spent during the year).  Again, there are so many variables that the validity and reliability could come into question (IES and REL have already spoken on this matter).  While I think that giving my post-test to the same students will reveal valid results, I don't think it would be reliable across the subject matter in my county; there are certain performance aspects that each teacher is to develop for their class.  This performance aspect should be well thought out, prepared, organized, and an effective measurement of the performance task.

Timing:

  • In our county, we are supposed to administer the SLO during the first 2 weeks of school.  This is the same period when students are still shifting in their Connections classes and students are entering or leaving the school.  The rosters are changing, being printed, and put in teachers' boxes for the next day; but the computer system doesn't update until after midnight, so the rosters in our boxes are not correct for the students who should be in the class.  We had to individually tell students what classes to go to - as they were walking in the door.  This took time out of our testing window.
  • It is possible that the written portion of an SLO may take at least 45 minutes.  The Connections class period is 45 minutes - which includes the transition to the class, calling roll, setting up for the test, etc.  Therefore, unless the Connections classes meet only the A block for 90 minutes, a regular 45-minute Connections class can't meet the requirements for the test.  Some schools are extending the A class (and the B class the next day) to 90 minutes; others are not.  This timing issue would never have been allowed on the CRCT (which Georgia has now retired).  It appears there was an assumption that each Connections class would be extended to the full 90-minute block: Class A on Day 1, Class B on Day 2.
  • Some of the performance aspects for some of the Connections classes are given 5 minutes per student to demonstrate mastery.  This creates 2 huge issues:
    • There is a teacher in the county who has 70 students in one class....Do the math: how many class periods will it take to administer just the performance aspect of the SLO?  14 days - if everything works perfectly.
    • Some teachers have completed the written portion of the SLO but now have to complete the performance (or spoken/language) portion next week.  That's fine except we have ITBS testing Monday - Thursday AND CogAT testing next week.  The entire school schedule is adjusted, and Connections classes will not be more than 35 minutes each - and that does not include the transition to class, calling roll, distributing material, etc.  So, if a teacher has 30 students, has to give 5 minutes to each student demonstrating their performance, but only has 25 minutes of class time, how many DAYS will it take to pre-assess the class?  I count 6 DAYS.  6 days for performance, 1 day for written this 9 weeks; then 7 days again at the end of the year: that is almost 3 instructional weeks missed due to testing....
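The day counts above can be checked with a quick sketch.  The numbers here are the examples from this post (5 minutes per student, roughly 25 usable minutes in a shortened period), not official figures from the county or the SLO documents:

```python
# Sketch of the testing-window arithmetic: each student needs a fixed
# number of minutes for the individual performance portion, and usable
# class time is what remains after transitions, roll, and setup.
import math

def days_needed(students, minutes_per_student=5, usable_class_minutes=25):
    """Class days required to hear every student individually."""
    total_minutes = students * minutes_per_student
    return math.ceil(total_minutes / usable_class_minutes)

# 30 students in a shortened 35-minute period (~25 usable minutes):
print(days_needed(30))   # 6 days
# 70 students, even with a full 25 usable minutes per period:
print(days_needed(70))   # 14 days
```

Change the assumptions however you like; the point stands that the pre-test alone can consume a week or more of instruction.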
I recommend that you get with your administrator(s), TKES evaluator, or assessment coordinator and try to work out as many bugs as you can before the day of the test.  If you would like advice or other thoughts, please contact me....

Thursday, August 14, 2014

Real TKES, Day 1: S.L.O. pre-test

Oh my goodness - what a day.  I gave my first SLO pre-test to my 6th grade class today.  Compared to the CRCT, ITBS, CogAT, and SAT tests, the S.L.O. tests don't feel quite as "standardized."  This has led to a lot of questions regarding the validity and reliability of the tests - quite well founded, I might add.  Despite the best intentions of the personnel who created the tests, wrote the directions, and developed the SLOs, more work clearly needs to be done.  I got the impression that this was legislation passed down onto public education by people who did not comprehend the process of education.  Not that I'm a big fan of the process - I have worked a long time to make it more efficient and effective - but the public education behemoth can only move so fast...

Now, having said that, here are some observations from today - my class and colleagues' classes - perhaps these may assist you.


  1. The students need to arrive on time for class - just like Administrators do for the CRCT.
  2. For some of the SLOs, there are performance tasks.  The teacher in charge will need to see the SLO questions, or at least the performance section of the test, so that preparations can be made!  In HomeEc, many materials have to be set up before the testing; in band, a 4-measure rhythm has to be determined, a band excerpt has to be selected, a band performance piece has to be chosen, and a technical exercise must be selected.  Teachers were calling me 20 minutes before the test asking what, exactly, was required.
  3. The administrator/evaluator in charge of scheduling must be aware of the time allotments for the tests.  
    1. One test had three portions: a written portion (45 minutes minimum), a performance portion (individual performance [5 minutes per student] and whole group performance  [15 minutes]), and an evaluation/listening portion (5 minutes).
    2. It was assumed in the writing of the SLO that the 45-minute Connections class period would be extended to 90 minutes so that all parts of the test could be administered in one day.  In School A, the period was extended to 90 minutes; in School B, the class period was kept at 45 minutes.  School A made it through all portions of the test in the 90 minutes.  School B can't fulfill the requirements because the students have to enter the room, get pencils, distribute the tests and answer sheets, fill out their names, etc.  It is impossible to accomplish if the schedule isn't adjusted.
    3. If students are required to sing/play/demonstrate a mastery of some skill and the time is allotted for 5 minutes per student, what does the chorus teacher do who has 70 students in first period?  You do the math of how long that will take - just for the performance piece.
  4. Be sure that the students who have IEPs are not pulled out too soon.  Some of ours were pulled out during home room, and then there seemed to be a scurry to make sure they could demonstrate all of the performance aspects that were required during the time allotted.
In short, organization is key, but since most schools have not implemented these SLOs before, administrators may not be attuned to the details that make the process run smoothly.

If you've made it this far in the BLOG, congratulations!  You should know that I was on the original committee to write the SLOs (and I think our team did a GREAT job), I helped on the second review committee to make revisions, and helped communicate with other teachers in the county so that our process (as a subject matter) would go smoothly!  

Day 2 and on: scoring the papers, giving totals, tabulating the results.  Should be fun.

Good luck, and best wishes.  Respectfully,
Glenn

Wednesday, August 13, 2014

Data

Data is becoming a recurring theme at my school.  While this is nothing new to most of us, its implications and use - in the TKES process - probably are.

Consider Standard 2: Instructional Planning: The teacher plans using state and local school district curricula and standards, effective strategies, resources, and data [emphasis added] to address the differentiated needs of all students. It is because of the lack of specific data that a teacher in a pilot program initially received a Level II in this area.  Ultimately, through more observations and her ability to document how she was using data, she scored Level III.  But, the initial scare (of not passing) was real.

Currently, we're being told to carefully consider 2.1: Analyzes and uses student learning data to inform planning [emphasis added].  To me, that means that my current formative assessments are helping me determine the lesson plans leading to the summative assessment.  My summative assessment is planned, but my lesson planning is flexible to make sure the students arrive at the summative prepared and on time.  I don't want my students "just" to make an A, I really want them to learn the material and demonstrate that by making an A!  I want them to be changed people because of my instruction! 

After my summative assessment, I will log the grades, print the report, and store at least one grade document in my notebook under 2.1, with a note on how the summative report will - or did - guide and direct my lesson plans for the next summative assessment.  Essentially, if my students hit the target for this assessment, I'll keep the plans similar; if they don't demonstrate mastery, I'll adjust my lesson plans.  We probably all show that our data guides how we give instruction in the next unit, but now we have to make those decisions intentionally and document them.

In short, if the lesson process worked, keep it; if it didn't, analyze the test scores and change the instruction to adapt.
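That keep-or-adjust decision can be written down as a tiny rule.  This is purely my own illustrative sketch - the 80-point mastery cutoff and the 80% target rate are numbers I made up for the example, not anything prescribed by TKES:

```python
# Hedged sketch of the keep-or-adjust rule: if enough students hit the
# mastery cutoff on the summative, keep the unit plans; otherwise,
# analyze the scores and adjust instruction for the next unit.

def plan_next_unit(scores, mastery_cutoff=80, target_rate=0.8):
    """Return 'keep plans' if the share of students at or above the
    cutoff meets the target rate; otherwise 'adjust instruction'."""
    if not scores:
        return "adjust instruction"
    hit = sum(1 for s in scores if s >= mastery_cutoff)
    if hit / len(scores) >= target_rate:
        return "keep plans"
    return "adjust instruction"

print(plan_next_unit([95, 88, 72, 91, 84]))   # 4 of 5 at 80+ -> keep plans
```

The point of writing it out is the documentation: the decision, and the data behind it, both go in the notebook under 2.1.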

As I noted on pp. 51-52 of my doctoral study:
Formative assessments.  McLeod (2005) noted that meta-analytic research showed effective formative assessments have a greater impact on improving student achievement, including closing the achievement gap, than “any other instructional practice” (p. 4), supply updated information to allow for redirected instruction, and can serve as benchmarks for annual learning goals.  In the classroom setting, McLeod asserted that, “data analysis should cause targeted instructional changes to improve student learning” (p. 5) and student data should be a part of continuous instructional improvement (Black & Wiliam, 2009; Hamilton et al., 2009; Huebner, 2009).  Teachers should make instructional decisions based on data from their students’ work accordingly (Lieberman & Miller, 2001), and formative assessments serve as guides for students’ progress toward annual learning goals (Huebner, 2009; McLeod, 2005).  Researchers found that effective formative assessment practices have shown to be powerful tools to improve student achievement and that formative assessments provide updated information to which the teacher could allow for redirected instruction that could benefit student learning (Huebner, 2009; McLeod, 2005; Popham, 2009b).  Formative assessments can help develop the student-teacher interaction, student motivation, and student achievement (Brookhart et al., 2008; Wiliam, 2007) and can highlight student accomplishments (Tomlinson, 2007).  Researchers showed frequent formative assessments revealed students’ thinking (Bransford et al., 2000) and could provide a “realistic measurement of students’ progress” (Dochy et al., 1999, p. 170).  The initial content or unit lesson plan should allow for predesigned formative assessments (Black & Wiliam, 2009).  As indicated in the March 2010 edition of the First Bell newsletter, the superintendent (Local County School District, 2010a) noted that teachers in Local County use informal benchmark information to design classroom instruction.

Cason, M. G. (2011). Activating Prior Knowledge With Cues and Questions As a Key Instructional Strategy to Increase Student Achievement in Low Socioeconomic Middle Schools. (Ed.D. 3469058), Walden University, United States -- Minnesota. Retrieved from http://ezp.waldenulibrary.org/login?url=http://proquest.umi.com/pqdweb?did=2459520451&Fmt=7&clientId=70192&RQT=309&VName=PQD

In short(er): give an assessment, print out the report, store it under Standard 2, and show in your lesson plans how that assessment is guiding your instruction/curriculum/planning for the next unit.  I included in my lesson plans a statement along the lines of, "____ (Performance Task(s), Post-Test, Unit Test, Section Test) will guide ___ (today's instruction, future planning, next assessment, next content, next unit)."

Hope this helps.  See my website for more information: https://sites.google.com/site/gcason123/

Tuesday, August 12, 2014

Quote of the day...

This is one of the quotes that came out of our TKES training meeting today.  It was regarding the increased pressure and "quality control" feelings from the teachers and how to manage the stress:

"Don't get TKEd off!"

We may get shirts printed....  :-)

Sunday, August 3, 2014

Fact Sheet #15: Documentation

Georgia Department of Education. (2014). Teacher Keys Effectiveness System.  Atlanta:  Retrieved from http://www.gadoe.org/School-Improvement/Teacher-and-Leader-Effectiveness/Documents/TKES%20Handbook%20FINAL%207-18-2013.pdf.

p.67

Fact Sheet #15–Documentation
DOCUMENTATION AS A DATA SOURCE FOR TEACHER EVALUATION
Introduction
Documentation of a teacher’s performance can serve as valuable and insightful evidence for detailing the work that teachers actually do. Evaluators may request documentation when a standard is not observed during an announced or unannounced observation. Documentation should emphasize naturally-occurring artifacts from teachers’ work (i.e., lesson plans, instructional units, student assessments).

Documentation of teacher practice and process is an important part of a comprehensive approach for documenting teacher performance. Generally, a teacher’s evaluation documentation is considered to be “a structured collection of selected artifacts that demonstrate a teacher’s competence and growth”.1

Documentation serves as a system for collecting data and recording work quality during each evaluation cycle. Specifically, the documentation houses pertinent data that confirms the teacher meets the established performance standards. Written analysis and reflection about artifacts often are included in the documentation to provide insight into the rationale for the events and process documented in each entry. Documentation is designed to serve as a complement to other data sources in order to provide a fuller, fairer, more comprehensive view of teacher performance.

Advantages of Documentation
  • The artifacts included in documentation provide evaluators with information they likely would not observe during the course of a typical classroom visit.
  • Documentation provides the teacher with an opportunity for self-reflection, demonstration of quality work, and a basis for two-way communication with an evaluator. Tucker, Stronge, and Gareis discussed the beneficial nature of documentation by pointing out it is: “Appealing for many reasons, including their authentic nature, recognition of the complex nature of teaching, encouragement of self-reflection, and facilitation of collaborative interaction with colleagues and supervisors… [It embodies] professionalism because it encourages the reflection and self-monitoring that are hallmarks of the true professional.”2

Concerns of Documentation
  • When goals and standards are not determined, the result can be unfocused and haphazard. The materials included could be idiosyncratic and biased.

How Is Documentation Aligned with the Teacher Standards?

Documentation contains a broader, more comprehensive collection of naturally occurring materials than other data sources. A variety of evidence may go into documentation, such as: student work; unit/lesson plans; student assessments; evidence of professional development activities; professional publications; recordings of teaching; samples of instructional materials; diagrams of classroom arrangement; summaries of analyses of longitudinal student test scores; evidence of help given to colleagues; information from others, such as observation of teaching by qualified others; and significant correspondence and memos.3  Therefore, documentation gives teachers an opportunity to demonstrate professional competence with regard to the standards identified in the evaluation system.

Examples of Documentation Evidence

Georgia Department of Education. (2014). Teacher Keys Effectiveness System Handbook. Atlanta, GA: Author. Retrieved from http://legisweb.state.wy.us/InterimCommittee/2012/TKESHandbook.pdf
“All Rights Reserved”


p. 64

Evaluators may request documentation from teachers when a standard is not observed during an announced or unannounced observation. The examples below provide ideas that may be helpful when further documentation is needed. This is not a comprehensive list of examples and should not be used as a checklist. [emphasis in original] Documentation may also need to be supplemented with conversation, discussion, and/or annotations to clarify the teacher’s practice and process.
Standards and Examples of Documentation
1. Professional Knowledge
  • Summary of a plan for integrating instruction
  • Class profile
  • Annotated list of instructional activities for a unit
  • Annotated photographs of teacher-made displays used in instruction
  • Annotated samples or photographs of instructional materials created by the teacher
  • Lesson/intervention plan (including goals and objectives, activities, resources, and assessment measures)
2. Instructional Planning
  • Course syllabus
  • Lesson plan
  • Intervention plan
  • Team/department meeting minutes
  • Substitute lesson plan
3. Instructional Strategies
  • Samples of handouts/presentation visuals
  • Technology samples on disk
  • Video of teacher using various instructional strategies
4. Differentiated Instruction
  • Summary of consultation with appropriate staff members regarding special needs of individual students
  • Samples of extension or remediation activities
  • Video or annotated photographs of class working on differentiated activities
  • Video of teacher instructing various groups at different levels of challenge
5. Assessment Strategies
  • Copy of teacher-made tests and other assessment measures
  • Copy of scoring rubric used for a student project
  • Summary explaining grading procedures
6. Assessment Uses
  • Brief report describing your record-keeping system and how it is used to monitor student academic progress
  • Photocopies or photographs of student work with written comments
  • Samples of educational reports, progress reports, or letters prepared for parents or students
7. Positive Learning Environment
  • List of classroom rules with a brief explanation of the procedures used to develop and reinforce them
  • Diagram of the classroom with identifying comments
  • Schedule of daily classroom routines
  • Explanation of behavior management philosophy and procedures
8. Academically Challenging Environment
  • Samples of materials used to challenge students
  • Samples of materials used to encourage creative and critical thinking
  • Video of lesson with students problem-solving challenging problems
9. Professionalism
  • Documentation of presentations given
  • Certificates or other documentation from professional development activities completed (e.g., workshops, conferences, official transcripts from courses, etc.)
  • Thank you letter for serving as a mentor, cooperating teacher, school leader, volunteer, etc.
  • Reflection on personal goals
10. Communication
  • Samples of communication with students explaining expectations
  • Parent communication log
  • Sample of email concerning student progress
  • Sample of introductory letter to parents/guardians
  • Sample of communication with peers

Student Surveys of Instructional Practice

Georgia Department of Education. (2014). Teacher Keys Effectiveness System Handbook. Atlanta, GA: Author. Retrieved from http://legisweb.state.wy.us/InterimCommittee/2012/TKESHandbook.pdf
“All Rights Reserved”

p. 40
PART III: Surveys of Instructional Practice

Another measure of the Teacher Keys Effectiveness System consists of student surveys of instructional practice. Surveys are an important data collection tool used to gather client (in this instance, student) perceptions of teacher performance. The advantages of a survey design include the rapid turnaround in data collection, the limited cost of gathering the data, and the ability to infer the perceptions of a larger population from smaller groups of individuals. In the Teacher Keys Effectiveness System, surveys will be used as a measure of teacher effectiveness and as documentation to support four of the TAPS standards. These four standards (Standard 3: Instructional Strategies, Standard 4: Differentiated Instruction, Standard 7: Positive Learning Environment, and Standard 8: Academically Challenging Environment) reflect the direct experience of students in classrooms.

Multiple data sources enable the evaluator to obtain a more accurate picture of performance and assist the teacher in increasing student success. These data sources do not stand alone; they complement each other and should be integrated into the evaluation process to provide a richer portrait of teacher performance. The flaws of one data source are often the strengths of another, and by combining multiple methods, evaluators can make sounder judgments regarding teacher performance and reach decisions supported by multiple types of data. Student surveys may help the teacher set goals for continuous improvement (i.e., for formative evaluation); in other words, they provide feedback directly to the teacher for professional growth and development. Student surveys also may provide evaluators with information that cannot be accurately obtained during observation or through other types of documentation.

The surveys ask students to report on items they have directly experienced. Three different versions of the student survey (grades 3-5, 6-8, and 9-12) will be provided. The versions are designed to reflect developmental differences in students’ ability to provide useful feedback regarding their teacher. All surveys are to be completed anonymously to promote honest feedback.

Student Learning Objectives (SLOs)

Georgia Department of Education. (2014). Teacher Keys Effectiveness System Handbook. Atlanta, GA: Author. Retrieved from http://legisweb.state.wy.us/InterimCommittee/2012/TKESHandbook.pdf
“All Rights Reserved”

p. 27

Student Learning Objectives: District-determined SLOs are content-specific, grade-level learning objectives that are measurable, focused on growth in student learning, and aligned to curriculum standards. As a measure of teachers’ impact on student learning, SLOs give educators, school systems, and state leaders an additional means by which to understand, value, and recognize success in the classroom.

The primary purpose of SLOs is to improve student achievement at the classroom level. An equally important purpose is to provide evidence of each teacher’s instructional impact on student learning. The process of setting and using SLOs requires teachers to use assessments to measure student growth, which allows them to plan for student success by ensuring that every minute of instruction moves students, teachers, and schools toward the common vision of exemplary instruction and high levels of student academic growth. The Student Learning Objectives Operations Manual, located on SharePoint, contains detailed information and forms regarding SLO development.

Summative Assessment

Georgia Department of Education. (2014). Teacher Keys Effectiveness System Handbook. Atlanta, GA: Author. Retrieved from http://legisweb.state.wy.us/InterimCommittee/2012/TKESHandbook.pdf

“All Rights Reserved”

p. 17
Suggestions

When it comes time to conduct the formative and summative assessments, evaluators must rate teachers on all ten performance standards. Consequently, as evaluators conduct observations and review documentation, it is important that they keep all ten standards in mind. When conducting walkthroughs, evaluators should focus on a limited number of performance standards and/or indicators. They may find it useful to annotate the TAPS Reference Sheet as to which data source (observation and/or documentation) is likely to provide evidence related to a particular standard. Evaluators also may find it useful to review the teacher-generated listings from the Look Fors and Red Flags activity, and the Matching Observation and Documentation with Performance Standards activity used during the Orientation and Familiarization sessions with the teachers.

p. 20
Summative Assessment

After collecting information throughout the evaluation process, evaluators will provide a summative assessment of a teacher’s performance. Evaluators will use the Summative Assessment Report Form to evaluate performance on each standard using the four-category rating scale. By receiving a rating on each individual standard, the teacher is provided with a diagnostic profile of his or her performance for the evaluation cycle.

In making judgments for the summative assessment on each of the ten teacher performance standards, the evaluator should determine where the “totality of the evidence and most consistent practice” [emphasis in original] exists, based on observations, documentation of practice and process provided by the teacher, and Surveys of Instructional Practice. “Totality of the evidence and most consistent practice” [emphasis in original] as used here is intended to mean the overall weight of evidence. In other words, as applied to the four-point rating scale, the evaluator should ask, “In which rating category does the totality of the evidence fall?” In many instances, there will be performance evidence that may fit in more than one category. To reach a decision for aggregating the total set of data to reach a summative decision, the evaluator should ask “In which rating category does the evidence best fit?”

In addition to the ten separate ratings, the teacher will receive an overall TAPS point score. Exemplary ratings are worth 3 points, Proficient ratings are worth 2 points, and Needs Development ratings are worth 1 point; Ineffective ratings have no point value. Through the GaDOE TLE Electronic Platform, evaluators will receive a point value for all ten standards, which will produce a final TAPS score.
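To make the point scale above concrete, here is a minimal sketch in Python of how the overall TAPS score could be tallied. The function name, input format, and example profile are my own illustrations; in practice the GaDOE TLE Electronic Platform performs this calculation automatically.

```python
# Point values for the four-category TAPS rating scale, as described
# in the handbook excerpt above. (Illustrative sketch only; the GaDOE
# TLE Electronic Platform computes the official score.)
RATING_POINTS = {
    "Exemplary": 3,
    "Proficient": 2,
    "Needs Development": 1,
    "Ineffective": 0,
}

def taps_score(ratings):
    """Sum the point values of the ratings on all ten performance standards."""
    if len(ratings) != 10:
        raise ValueError("TAPS requires a rating on each of the ten standards")
    return sum(RATING_POINTS[rating] for rating in ratings)

# A hypothetical profile: Proficient on eight standards, Exemplary on two.
example = ["Proficient"] * 8 + ["Exemplary"] * 2
print(taps_score(example))  # 8*2 + 2*3 = 22
```

With this scale, the possible scores range from 0 (Ineffective on every standard) to 30 (Exemplary on every standard).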