Sunday, April 19, 2015

Student Survey Questions: valid? reliable?


As the TKES year draws to a close, it appears "the system" has some significant flaws.  There are some good elements to the new system (the TAPS in general are a good start), but other aspects are woefully unscientific, unreliable, even unprofessional.  This post focuses on the Student Survey Questions of Instructional Practice for grades 6-8.
  1. It is possible that the student survey questions will not and cannot lead to valid and reliable results (but we will use the results to evaluate teachers anyway).
  2. The questions themselves cannot apply to all classes, all teachers, or all grade levels.
  3. Most of the questions are nearly impossible to score a "strongly agree" on, because a 12-year-old is answering them.  (I typically do not "strongly agree" or "strongly disagree" with anything.  Better choices would be: Most of the time, Usually, Sometimes, Not very often, etc.)
  4. Because the questions cannot apply to each teacher, teacher effectiveness is determined by comparing teachers to other teachers in the school, the district, and the state (thus the "School, District, and State Mean" and "State Median" columns).  In other words, our effectiveness is not being graded against a teaching standard (the survey question), but against other teachers.  (What if we graded students against each other and not the curriculum standard?  That would be nonsense.)
Question #6: "My teacher chooses activities and assignments based on what students need to learn."  Yes, all the time: during class, for homework, for the day, the week, the nine weeks, the year, and for all three years of middle school.  What else do teachers do?  How could a student possibly answer anything other than "strongly agree"?  Because they do not know; they could not know.  Anyone without strong pedagogical training, knowledge, and practice is not even qualified to answer that question.  Teachers even collaborate across grade levels so that there is continuity of learning in middle school.  We are not having students fill out word search puzzles.  87% of my students answered positively to that question; however, at least one student answered "strongly disagree" to every question.

Question #7: "My teacher gives students as much individual attention as they need [emphasis added] to be successful."  A laughable survey question.  58% of my students answered positively to that.  Think of what a classroom would have to look like for 90% of the students to answer "strongly agree" to that (I cannot even imagine); remember, most classrooms are overcrowded....

Question #10:  "My teacher allows me to work with different groups of students depending on the activity we are doing."  Of course we do - any time, all the time, every time - depending on the activity assigned to accomplish the learning goal as determined by the teacher so that the students will learn the curriculum.  But that does not mean every day, or whenever the students want to.  The long-held and growing perception that collaborative learning is as effective as direct instruction is wrong.  The conditions required for collaborative learning to match direct instruction are so demanding that virtually no one meets them.  Stacks of high-quality research show that over and over again (I can send you sources if you want; this was an influence in my doctoral study).  I received a 47% positive response on that question.  Again, anyone without strong pedagogical training, knowledge, and practice is not even qualified to answer that question.

It appears to me that if I want higher scores on my survey results, I will need to create (another) system for next year to teach middle school students not only the state curriculum of concepts and skills, but also how to identify and understand the Teacher Performance Standards (pedagogy they have no knowledge of - so I can get better survey results) and how to master the Student Learning Objectives.  If only I could see my students for a full class period each day.

Saturday, April 18, 2015

Student Surveys of Teacher Instructional Practices: absolutely laughable, if it weren't true


  1. The use of student surveys (for middle school: 11-13 year olds) to impact teacher evaluation, TEM score, effectiveness, retention, and certification is absolutely laughable - if it weren't true.  Because it is true, it is horrifying and ridiculous.  Now teachers need to teach the students how to evaluate their teacher: here's what summary looks like, here's what differentiation looks like, here's what asking questions looks like, etc.  More later.
  2. It seems reasonable that if a child - a student - can take a survey that impacts a teacher's - a professional educator's - job, why couldn't we create a survey of our students that counts toward their grade (i.e., a summative test grade)?  More later.
  3. A friend of mine, who has a bird's-eye view of many things, believes this entire TKES process was developed to create an evaluation system capable of generating outrage and distrust in the Georgia public education system, so that public sentiment would move toward supporting Georgia politicians' push for more charter schools (which was just voted on and approved).  That would allow schools, teachers, and benefits to move out of GA DOE funding and into privatized companies - a shift rooted in years of declining governmental funding for public schools.  In its fullness, this will create a disparity between high-functioning charter schools and low-functioning public schools: a reinvented system of segregation - this time, between the haves and the have-nots.  TKES isn't about providing an improved, valid teacher evaluation; it's about creating political movement toward charter schools.  The opinion is hard to argue against; time will tell.
  4. A teacher's score, as with all other evaluation systems, is in the hands of the interpreting evaluator.  A teacher with less than 5 years of experience has seven IVs and three IIIs on his/her summative evaluation.  Another teacher with minimal experience has five IVs and five IIIs.  Another teacher with less than 5 years of experience, who provides differentiated experiences for his/her students every day, receives IIs.  One evaluator interprets by the letter of the rubric; another interprets, "if you could teach a class on this Standard, you should receive a IV."  ...completely different interpretations, different grades, and different summative outcomes for the teachers - connected to their certificates.  Word is that if you receive a II in any Standard, you may not be eligible for interview or hire in some counties.
  5. The survey questions do not and cannot apply to all subject matters; this creates validity/reliability problems (yet we are going to evaluate and judge teachers anyway.  See #3.)
For example:  Survey Question #4: "My teacher takes time each day to summarize what we have learned."  My score: 1.95 (17% strongly agree, 37% agree, 29% disagree, 15% strongly disagree).  Clearly, 12-year-old students are clueless (because they are only 12), I summarize more than one time per class period, and now it seems teachers need to teach students how to answer survey questions (in lieu of time spent on standards) to keep their jobs - or at least to avoid uncomfortable summative TKES conferences about their student survey results.  I graduated with High Honors (high school) and cum laude (college), earned a 3.8 (masters) and a 4.0 (doctorate), apply what I know, and constantly use formative assessments during class so my students will learn efficiently - and now I have to figure out a way for 12-year-old students to recognize and approve of my teaching strategies and methods to validate my abilities.  Something seems out of place here (see #3).

So, 44% of my students do not think I summarize the concept each day.  Hmmm.  Let's see.  Suppose I teach music.  I teach a new note: I give its location on the staff, its name, and its fingering; I let the students see it and finger it; I explain how to play it and how it is used; I show them why the note is the way it is; I have the students play the note; I double-check (through formative assessments) that the students understand and can perform the note; I have them play the note again; and then I have the students play the note in new music.  If that is not summarizing the new note, I do not know what is - but 44% of my students do not think so.  The same would go for playing rhythms, dribbling a basketball, singing a pitch, sewing a straight line, or drawing in perspective.  In many performance-based classes (Connections classes in particular), concept is about 20% and performance/skill is 80%; therefore, playing/singing/drawing/performing the new concept is summarizing.  So, if teachers are going to improve their student survey averages and TEM scores in the coming years, it appears we will need to find new ways of teaching the standards, documenting performance for evaluators, and teaching students how to answer the survey.
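As a side note on the arithmetic: a survey average like the 1.95 above is just a weighted mean of the response shares.  The sketch below is my own illustration - the function name and the 1-4 coding are assumptions, since the state's actual coding is not published with the report.  Notably, under a standard 4-high coding the reported shares give roughly 2.6, not 1.95, so the official scale evidently differs.

```python
def likert_mean(pcts, coding):
    """Weighted mean of survey responses.

    pcts: response shares (may sum to slightly less than 1.0 from rounding);
    coding: numeric score assigned to each response label (assumed here).
    """
    total = sum(pcts.values())
    return sum(pcts[k] * coding[k] for k in pcts) / total

# Shares reported for Question #4 above (they sum to 98%, presumably rounding).
pcts = {"strongly agree": 0.17, "agree": 0.37,
        "disagree": 0.29, "strongly disagree": 0.15}
# Assumed 1-4 coding, high = strongly agree.
coding = {"strongly agree": 4, "agree": 3,
          "disagree": 2, "strongly disagree": 1}
m = likert_mean(pcts, coding)  # about 2.57 under this assumed coding
```

Whatever the coding, the point stands: the single number reported back to the teacher compresses four response categories into one mean, and the scale behind it is not disclosed.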

Thursday, March 19, 2015

75 Instructional Strategies

I decided to type up a list of the different ways I teach the material presented in class.  Some of the ideas are common instructional strategies, some are well known, and some are not thought of as a typical "instructional strategy" - but they are.  These strategies help students learn.  The list was developed just by thinking through my day and took only as much time as it took to type.  I intend to continue the list as I come across other ways I present material, have students engage with the material, have students demonstrate mastery of the material, or interact with the curriculum.

Please see the page, "75 Instructional Strategies" or "75 Instructional Strategies-list" above.

If you have other ideas that could be included in the list, please let me know; I would like to try them.

Respectfully,
Glenn

Tuesday, March 10, 2015

...so, how is your TKES process going?

There have been a couple thousand hits to the TKES Blog since I started it - which is a surprise to me.  Along the same lines, there has been a significant increase in hits over the last few weeks.  There has been a lot of interest in setting up the notebook, getting organized, evidence for 3.6, examples of document evidence, teacher evidence and documenting performance, uploading documents to The Platform, TKES Specific Ideas and Essays, Handbooks, Fact Sheet #15 Documenting Performance, and Classroom Management, to name a few.

So, I'm curious if there is anything you have questions about, need help with, would like advice about, or a topic you would like me to comment on or research.  All inquiries would be confidential.  My evaluations have gone well, but not quite as I would like - I wanted all 4s in all 10 categories.  Even though I have significant evidence available on site and have uploaded 200-300 documents to The Platform, I think I will not get 4s in all areas - the wording of some of the categories probably will not allow it.  You are welcome to send me an e-mail at gcason123@gmail.com with your thoughts, ideas, recommendations, suggestions, comments, and requests.

Respectfully,
Glenn

Saturday, February 14, 2015

In: Georgia Milestones (standardized test)---Out: CRCT---In: TKES---Out: Standardized Tests for Evaluating Teacher's Effectiveness for Student Achievement

The standardized test that was to judge Georgia teachers' effectiveness on student achievement - and therefore contribute significantly to the Teacher Keys Effectiveness System calculation and, ultimately, teachers' annual TEM scores - is on hold, at least for one year.

Look at the article reporting on the State School Superintendent's thoughts here: AJC
Testing: Saying there is an overemphasis on test scores, Woods added, “We must aggressively lessen this burden.” He also wants a longer moratorium on using scores from the new Georgia Milestones k-12 tests, which roll out this year, to retain children or evaluate effective teaching. 
We were also presented with this update at our school.

(Sorry for the educational alphabet soup, but...)  
As I understand it, the starting design was this: the annual TEM score for teachers is calculated through the TKES process, which is composed of three parts: 1) calculations of student achievement gains determined through teacher-generated SLOs or standardized test results, 2) teachers receiving scores (1-4) on the 10 Standards and 72 Elements outlined in TAPS, and 3) student survey results.
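To make that three-part structure concrete, a composite like the TEM can be sketched as a weighted sum of the three components.  The weights and the common 1-4 scaling below are placeholders of my own for illustration, not the official formula:

```python
def tem_score(slo_gain, taps_avg, survey_avg, weights=(0.3, 0.5, 0.2)):
    # Illustrative composite of the three TKES components described above.
    # The weights are invented for this sketch, not taken from the state.
    w_slo, w_taps, w_survey = weights
    return w_slo * slo_gain + w_taps * taps_avg + w_survey * survey_avg

# All three components expressed on a 1-4 scale for the example.
score = tem_score(slo_gain=3.0, taps_avg=3.2, survey_avg=2.6)
```

Whatever the real weights turn out to be, the structure makes one thing plain: if two of the three components are thrown out (as described below), the remaining components carry all of the weight.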

Not even 12 months into the system, currently: 1) student achievement gains from SLOs thrown out [research indicates SLOs can be considered neither valid nor reliable], 2) student achievement gains from standardized tests thrown out [my understanding: the GA Milestones tests have not even been field tested for validity and reliability].  I would not be surprised if the TEM is thrown out before the end of the school year; that would leave TAPS and the student survey results.  I'd vote for keeping the TAPS only....

Sunday, February 8, 2015

Lawsuit: Tennessee: use of students' academic standardized test scores in teacher evaluation for "non-tested" grades and subjects

from: http://blogs.edweek.org/edweek/teacherbeat/2015/02/new_nea_suit_in_tenn_challenge.html

Education Week's blogs
Teacher Beat: NEA Lawsuit in Tennessee Challenges Evaluations of 'Non-Tested' Teachers
By Anthony Rebora on February 5, 2015 5:00 PM

The National Education Association's Tennessee affiliate today filed a new lawsuit challenging the state's use of students' standardized test scores in teacher evaluations, this time focusing on the system's effects on educators in "non-tested" grades and subjects.

Under Tennessee's much-watched evaluation system, unrolled in the 2011-12 school year, student test scores are factored into teachers' overall results through a statistical framework known as the Tennessee Value-Added Assessment System that seeks to isolate educators' impact on student-achievement growth [emphasis added; (philosophically, this sounds like a good idea)].

Teachers in tested grades and subjects receive individual value-added scores that count for 35 percent of their overall evaluation score [emphasis added]. However, teachers in non-tested grades and subjects—more than half the educators in the state, according to the TEA—are given composite, school-based value-added scores (generally derived from students' scores in the tested subjects) that make up 25 percent of their evaluations [emphasis added].

The TEA's suit, which will be litigated by the National Education Association, names as co-plaintiffs two educators in non-tested subjects—a middle school visual arts teacher and a middle school physical education teacher—who say their evaluation scores dropped as a result of their school-based value-added scores. On account of their evaluation outcomes, the TEA says, one of the plaintiffs was denied a bonus, while the other lost her eligibility to be recommended for tenure [emphasis added].

Teachers in non-tested grades and subjects, the suit argues, are "being evaluated substantially based on school-level TVAAS estimates that do not reflect the contributions of these teachers to their students' learning in the courses they teach [emphasis added]. ... In fact, these school-level TVAAS estimates provide no indication at all as to the quality of the instruction of a particular teacher [emphasis added]."

The TEA contends that the system violates the educators' due process and equal protection rights under the U.S. Constitution.

In response to the suit, Tennessee's top education official defended the state's use of student-growth measures to evaluate teachers.

"Teachers are getting more feedback than ever to help improve their classroom instruction, and ultimately, student learning," Candice McQueen, the state's recently appointed commissioner of education, said in an emailed statement. "We see evidence that this is working; Tennessee students are the fastest improving in the nation. The department remains committed to providing meaningful feedback to teachers based, in part, on student growth."

However, Douglas N. Harris, an associate professor of economics at Tulane University in New Orleans who has studied teacher-evaluation approaches, said that the issue of scoring teachers in non-tested grades and subjects remains a key sticking point for state evaluation systems that seek to incorporate student-achievement growth.

In particular, Harris said, use of school-wide value-added scores is widely regarded as "blatantly unfair" because educators are seen as "being evaluated based on the performance of other teachers [emphasis added]." "On this issue, the union has a very good point," he said.

Comments: this is just one of many flawed versions of teacher evaluation systems created by politicians.  As with the new system created in Georgia, the laws are created, passed, and implemented, and then, after enormous input from those affected by the poor legislation, adjusted to calm the cry of those affected.  Specifically, the Student Learning Objectives were created (I helped create the middle school SLO for band), implemented, and used as part of the TKES evaluation system for this current school year (after a year or more of piloting).  Before 5 months had passed, due to the outrage, the Georgia politicians reversed the decision to have the SLOs impact the overall teacher evaluation score.

For politicians in general, and Tennessee's in particular, they should know (at least intuitively) that 80% of the factors in student achievement come from the home environment, 6.6% from school-level factors, and 13.4% from teacher-level factors (Marzano, 2000).  When I first saw those statistics on a DVD of Dr. Marzano presenting in one of my doctoral classes, I was devastated.  I suspected a similar relationship among the three factors, but never to that degree.  Dr. Marzano immediately pointed out that the teacher-level influence has twice the effect of the school-level influence; that added some comfort, but not much.  Essentially, through the use of effective instructional strategies the teacher can offset some of the student-level factors, but the 20% of influence on student achievement will never overpower the 80%.

As such, it seems out of line for Tennessee (and other states?) to base 25% of a "non-tested" teacher's evaluation score on results over which that teacher has 0% influence - results driven by other teachers and by school-level factors (6.6% influence).

America, and the states, run as a democracy (which is a good thing).  The people who vote decide who holds political office.

Marzano, R. J. (2000). A new era of school reform: Going where the research takes us. Aurora, CO: Mid-continent Research for Education and Learning.

Friday, January 9, 2015

Expected legislation

A friend sent me this information.  The highlighted areas are from the e-mail to indicate which items are concerned with education.
There are lots of other bills that are expected to be introduced.  Here are some of the more talked about proposals.
  1. Gun Bill—gun advocates still want the opportunity to carry guns everywhere including college campuses.  Having seen some of the officiating calls in college football games this year, if this bill passes football officials may have to wear bulletproof vests!  It is anticipated that HB 826 will be reintroduced and gives school systems more flexibility in dealing with weapons that are not used in a threatening manner.
  2. Expect some changes to TKES and LKES.  Legislators on the education committees heard an earful at the listening sessions around the state.
  3. Education Committee members are committed to changing the Status Quo option that systems have as a choice effective July 1, 2015.  The SBOE is already making changes to the IE2 and Charter System options via rule changes.
  4. PBIS is gaining wide support throughout the state and you can anticipate that it will be funded at a higher level this year.
  5. Many legislators are tired of hearing complaints about integrated math, and you can anticipate that there will be legislation aimed at returning us to discrete math whether or not the SBOE changes the rule.
  6. One bill that will receive significant attention will be introduced by Rep. Mike Dudgeon of Johns Creek.  Rep. Dudgeon has announced that he will introduce a constitutional amendment that will call for the election of the State School Board by congressional district with the elected school board given the responsibility to appoint the State School Superintendent.  Constitutional amendment legislation if passed will not appear on the ballot until November 2016.
  7. The Governor will appoint a commission to study the funding formula for education.  (Perdue did the same; his commission studied it for five years, and all they came up with is IE2.)  It will be an arduous task to replace QBE and maintain equity and fairness.  This one could get tricky!
  8. Hopefully, Common Core is an over-and-done issue.  The listening sessions, surveys, and defeat of Tea Party challengers should put an end to the Common Core challenges.

Thursday, January 8, 2015

Steps to prepare for your observation

As I become more aware of my responsibilities for the TKES evaluation, I am trying to become more efficient (to reduce my stress!).  For my next evaluation, I followed this process:

  1. I decided to make my lesson plans according to the TKES Lesson Plan design I created last year (see sample here) and make sure all drop down fields were accurate and up to date.
  2. I typed up what will go on the SmartBoard for the students to know what we are doing in class that day.  I copied and pasted the student version to a new page in the same word processing document.
  3. I slowly read through all 72 TAPS Elements, and as I saw an Element that might apply to my lesson that day, I inserted the Element's outline number (say, 2.6 or 4.5) into the SmartBoard outline that I will give to the Evaluator.
  4. After reading all the Elements, I have two documents to give to the Evaluator: a formal TKES lesson plan, and a student version of the lesson with the Elements identified.
  5. I still have my TAPS notebooks with evidence in the cabinet as well as just the outlines of my evidence in my lesson plan notebook in case the Evaluator wants to see them.
What I noticed is that even some of my reminders to the students about upcoming activities or after school events can be tagged with an Element.  An after school practice, rehearsal, or event can count as 4.2, 4.3, &/or 4.6.  Reminding them about upcoming tests or units could count as 2.6, 3.2, &/or 4.6.

Sample portion of student version of lesson given to Evaluator
Give yourself a grade (1-5) on your progress today! [6.7] 6th grade: Brass: Lip Slurs [1.4, 1.6, 3.2, 4.1, 4.5, 4.6, 5.3, 6.1, 6.3, 6.6, 8.3, 8.5, 8.7]; Bells: Octave/Chromatic [1.4, 1.6, 3.2, 4.1, 4.6, 6.1, 6.3, 6.6, 8.3, 8.5, 8.7]; Flutes: Aperture control [1.4, 1.6, 3.2, 4.1, 4.5, 8.3, 8.5, 8.7]; Clarinets and Saxophones: register slurs to determine embouchure, tongue placement, amount of mouthpiece, reed quality, air stream, hand position [MMSBB.2.b, MMSBB.3.a] [1.2, 1.4, 1.5, 1.6, 3.2, 4.1, 4.5, 4.6, 6.1, 6.3, 6.6, 8.3, 8.5, 8.7]

These steps are also outlined in "How to pass all 10 TAPS in one lesson" and "District Walkthrough"  here.

Friday, December 26, 2014

Read This: Your TAPS Evaluation Score is up to You--Not Your Evaluator!

It is becoming apparent that my prediction from a year ago - that defending your job would mean collecting, organizing, and updating evidence in notebooks (I was called paranoid back then) - is becoming reality and possibly a necessity.

Your Evaluator may only look for as much evidence as they want to, take time to, or have time to - and then give you a score.  The score may not be a huge concern to them (even if it is a 2 or 1) because the score is not directly related to their certificate.  The teacher's evaluation score is up to the teacher, not the Evaluator.  Let me explain.

At the mid-year evaluation, a friend of mine received a 2 on TAPS #6.  The Evaluator looked at two students' grades in the electronic grade book; out of 9 summative grades, those two students had turned in either none or one.  The Evaluator concluded that the teacher was not assessing well and summarily gave the teacher a 2.

The REAL story is that, out of the 30 or so students in the class, the Evaluator picked the two students who are ill-behaved, in ISS (in-school suspension), frequently absent, and/or two of the most troubled students in the school.  The Evaluator did not look at the entire class's grades, average, or completion status (which s/he was able to do) - only the two "problem children."  Now, to me, that is either a "gotcha," a vendetta of some sort, poor training, or incompetence on the part of the Evaluator.  As people say, "That ain't right."  The TKES/TAPS process has been presented as a "totality of the evidence," but the evidence has to be reviewed first.

As a result, the teacher, in his/her defense (and anxiety), had to spend quite some time pointing out to the Evaluator the other students in the same class.  In fact, the project turn-in rate and grades were quite high.  Seeing the data, the Evaluator changed the 2 to a 3.

If you have ever been in a position where you had to regularly defend your job, you know the stress it creates, the morale it destroys, and the effectiveness it drains.  It creates a terrible work environment - especially mentally.  That teacher looked defeated.

However, in my mid-year conference, my Evaluator indicated that s/he had reviewed some of the evidence I had uploaded into The Platform (I think I have scanned and uploaded about 300 items).  That sounded good to my ears.  To me, that is an indication that s/he is trying to review the totality of the evidence, is doing his/her due diligence, and if there is a question, we can refer to it during the conference in The Platform.

I urge the effective teachers of this state to collect past and present evidence for the TAPS Elements, organize it in some fashion that can be easily accessed, and take the evidence to meetings.  If you need suggestions on how to collect evidence, organize it, present it in notebooks, or upload it into The Platform online, please see my earlier blog posts - also review my post "How to pass all 10 TAPS in one lesson."  If you are going to the GMEA convention in January, stop by the poster presentation session or the Friday evening concert and let's talk.  We'll talk TKES.

Sunday, December 14, 2014

Substantial Poverty in Georgia...is there anything teachers can do?


  • (Dusen, p. 6) "About 57 percent of the students in Georgia's public schools are considered low-income...."  "With 27.2% of the state's children living in poverty, Georgia now has the 6th highest childhood poverty rate in the nation. U.S. Census Bureau, American Community Survey Profile, September, 2013"
  • (p. 7) "...87 percent of the school districts in Georgia serve a majority of low-income students."  "...the 2015 state budget includes a $350 million increase in education funding, but...this will do little to reverse the $8 billion in austerity cuts they've suffered in the past decade."  "'People recognize that poverty has increased, but I don't think they understand the full impact it has on the child,' says Reada Hamm...."
  • (p. 8) "Poverty Rates of Georgia's 5 Largest Cities: Augusta: 27.4%, Macon: 27.4%, Savannah: 20.3%, Atlanta: 18.9%, Columbus: 18.5%"  
  • (p. 9) "More than 1 in every 4 Georgia children are food insecure--28.8%.  That's more than 700,000 children under the age of 18."
  • and again:  (p. 12) "87% of the school districts in Georgia serve a majority of low-income students."

As a teacher in a low-income county and school, the effects of low income (low socioeconomic status and poverty) are real.  In Georgia, students are tired because they are taking care of siblings when they get home, students do not know when the lights may go out or the water will be turned off, parents are taken to jail, students live in homes where multiple families are sharing inadequate space, mothers have multiple boyfriends, students come home to find their belongings on the front lawn, students may not even live in a home but a storage unit.... (Dusen)

I may not be able to make changes to "the system" so that poverty is eliminated, but I can make a difference for the students in my classroom.  I found this out: a teacher can offset some of the factors of low-income situations (Cason, 2011).  My doctoral research was driven by the fact that the students in my Title 1 school were not achieving as highly as the students in the higher-SES schools in the same county.  This should not be the case!  We have the same curriculum, same lesson plan design (which was the problem), and good principals, teachers, and facilities.  Why is there a disparity?

Having learned, through research, the effects of poverty and low-SES environments on students and their families as they relate to getting an education in public schools, I turned my focus on the area(s) where a teacher can specifically work to counter-act some of those effects so that students from low-SES settings can achieve.

To review the entire study, including problems, data, research, findings, and literature review, please refer to this link: Doctoral Study.  To review just the specific doctoral project lesson plan, click "GC-Doctoral Study Project-Presentation" here.  The lesson plan really makes a difference in student achievement (F(1, 863) = 35.398, p < .001).  In simple terms, the students of the teachers who used the lesson plan scored at least half a letter grade higher than the control group.  The lesson plan is adaptable to all grades, all subjects, and all levels, and is compatible with TKES.

Comments from the web site:
In this section I have attached my doctoral study entitled "Activating Prior Knowledge With Cues and Questions as a Key Instructional Strategy to Increase Student Achievement in Low Socioeconomic Middle Schools" for your review. 
Using archival data, this ex post facto study found a statistically significant difference using an ANCOVA, F(1, 863) = 35.398, p < .001, for the research question investigating the effect on student achievement when teachers specifically activate students’ prior knowledge before using the LFS model of instruction.
The resulting project from my doctoral study was a lesson plan design that incorporated activating students' prior knowledge before starting the main learning goal(s).  Prior knowledge is a critical component of learning new material, concepts, or skills; unfortunately, it is often overlooked in a rush to 'get on with the lesson.'  
"Curriculum coverage is not synonymous with learning" (p. 3)
Teachers, even though our evaluation system has changed and our stress has (probably) increased, teach.  Use the best instructional strategies (document 1), the best methods (document 2), specific goals, and direct instruction to offset some of the effects of poverty and low-socioeconomic situations.  Teachers can use the lesson plan in non-poverty areas as well.
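For readers curious about the mechanics behind a result like F(1, 863) = 35.398: an ANCOVA compares groups on an outcome while adjusting for a covariate (in my study, teachers who did or did not activate prior knowledge, adjusting for prior achievement).  The sketch below is purely illustrative - the data are synthetic and the 0.5 treatment effect is invented - but it shows where such an F statistic comes from: the drop in residual error when the group term is added to the covariate-only model.

```python
import random

def solve(A, b):
    # Gauss-Jordan elimination with partial pivoting for a small dense system.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def rss(X, y):
    # Residual sum of squares of an OLS fit via normal equations X'X b = X'y.
    p = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(p)] for i in range(p)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(p)]
    beta = solve(XtX, Xty)
    return sum((yi - sum(b * xi for b, xi in zip(beta, r))) ** 2
               for r, yi in zip(X, y))

def ancova_f(pre, post, group):
    # F test for the group effect, adjusting for the pretest covariate:
    # compare RSS of post ~ pre (reduced) vs post ~ pre + group (full).
    n = len(post)
    rss_reduced = rss([[1.0, p] for p in pre], post)
    rss_full = rss([[1.0, p, g] for p, g in zip(pre, group)], post)
    return (rss_reduced - rss_full) / (rss_full / (n - 3))

# Synthetic two-group data with an invented treatment effect of 0.5.
rng = random.Random(0)
pre = [rng.gauss(0, 1) for _ in range(200)]
group = [0] * 100 + [1] * 100
post = [0.8 * p + 0.5 * g + rng.gauss(0, 0.5) for p, g in zip(pre, group)]
F = ancova_f(pre, post, group)  # large F => group effect beyond the pretest
```

The design choice matters here: adjusting for the pretest removes variance the teacher never controlled, which is exactly what school-wide value-added scores for "non-tested" teachers fail to do.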

References
Cason, M. G. (2011). Activating Prior Knowledge With Cues and Questions As a Key Instructional Strategy to Increase Student Achievement in Low Socioeconomic Middle Schools. (Ed.D. 3469058), Walden University, United States -- Minnesota. Retrieved from https://sites.google.com/site/gcason123/doctoral-study  

Dusen, C. V. (2014, August/September). The Growing Face of Poverty. PAGE ONE, August/September, 32.

Monday, December 8, 2014

Zero TKES/TAPS Stress Now...

I must say that realizing I could document and pass all 10 Standards on a day when I was not even teaching has been very liberating.  I am not going to upload any more documents into The Platform.  I am not going to save copies of any more grade reports or data samples.  I am not going to update my notebooks.  I am not going to worry about any more evidence.

If you have not seen the blog post, handbook, or PowerPoint on what I did, I recommend you take a few minutes to view it--it will save you time.

I incorporated the lesson plan from my doctoral study (Doctoral Study Lesson Plan) with the TAPS Standards and Elements and made notations in my lesson for the Evaluator on how I was meeting each Standard that day.  I think it is comprehensive and convincing.  A friend of mine, who hopes to be a principal soon, said that s/he would use it to help prepare his/her teachers for TKES.

The handbook and PowerPoint can both be found on my website on this page: GC-District Walkthrough-PPT.  I prefer the PowerPoint for viewing...

Respectfully,
Glenn

Thursday, December 4, 2014

TKES Conversation-January 2015

If you are going to the Georgia Music Educators Association state convention in Savannah January 29-31, why not stop by to have a conversation, share ideas, and discuss TKES?  I'll be at the poster session presenting the results of my doctoral study: "Activating Prior Knowledge with Cues and Questions as a Key Instructional Strategy to Increase Student Achievement in Low Socioeconomic Middle Schools."  The result was a lesson plan introduction that dramatically increased achievement for students whose teachers used the lesson plan, compared to the students of teachers who did not.
Using archival data, this ex post facto study found a statistically significant difference using an ANCOVA, F(1, 863) = 35.398, p < .001, for the research question investigating the effect on student achievement when teachers specifically activate students’ prior knowledge before using the LFS model of instruction.
The lesson plan from my doctoral study is used in my TKES lesson as well.

I will have copies of my "Saxophone Handbook" for you to review, "How to pass all 10 TAPS in one lesson," and other items that could assist you in a) teaching, b) TKES, c) saxophone instruction, or d) lesson plans.  All of the information is free.

I will be performing at the Friday night concert as well; it should be a good program of music.

Hope to see you there.