Thursday, June 9, 2011

The Cumulative Final Exam

Please excuse the heavy use of quotation marks, and please treat each quoted phrase as if I'm making the air-quotes gesture with my hands.

Scenario A: Teacher speeds through all course material and is "done teaching" for the year by May.  They spend the remaining 1.5 months "reviewing" for the final.

Scenario B: Teacher spends time on each concept, working at the students' pace to ensure everybody develops a good understanding of each concept, differentiating when appropriate.  By the time June rolls around, there are 2 whole chapters (since when does learning happen in chapters?) that were not "covered".

Since they both teach the same course, each teacher's students will take the same final exam.  After all, what better way to achieve a standardized curriculum than to make the final exam the same for everybody?  What will these final exam scores look like?

The teacher from Scenario A looks like a superstar.  An expert teacher whose students excel on the final exam.  A true veteran.  The teacher from Scenario B looks like a bumbling fool, a novice who can't even "cover their entire curriculum".  This is the teacher that others look at and say "They move too slow." (not sarcasm quotes)

The students from Scenario A look like the best and brightest, and are rewarded by high scores on a final exam that counts for 20% of their final grade.  Yes, 20%!!!  90 minutes of a school year equates to 20% of a student's grade.  The students of Scenario B are punished for having a teacher who cares about everybody learning, and a teacher that is following a district model of formative assessment and differentiated instruction.  They will not have high scores on their final exam, the same final exam that Scenario A's students beasted.

How does it make sense to give the same final exam, knowing that every teacher employs a different teaching style?  Why can't we have some freedom to develop our own final evaluation of what a student knows?  Something that is not necessarily a multiple choice exam.  Here's what I'd do: My End of Year Projects

Wednesday, June 8, 2011

Card Game: AP Stat Ideas

Below are 4 hands from the game "13" that my students play all the time now that projects are getting finished and school is ending.  Apparently 2's are the best.  I don't know the rules, and I'm not certain they do either.  I believe you can play straights and multiple pairs of multiple numbers, but I honestly don't know.  Do people besides students at my school play this game (i.e., do you know what a "cutup" is)?  An interesting way to analyze probability, anyhow.  Here are 4 real hands dealt during a resource period.  Ideas?  Who will win?  Who has the "best" hand?




Wednesday, June 1, 2011

End of Year Project Topics

This is an incredibly interesting list of project ideas that my students have generated.  I thought I'd share what they're capable of when they have zero restrictions...

10-11 End of Year Project Ideas 

Some highlights if you don't feel like reading through the whole thing...
  1. Mythbusters: Mac vs PC
  2. Value of a Power Hitter or Contact Hitter in Fantasy Baseball
  3. Developing software that is usable exclusively in an AP Stat class
  4. Using a test vs. a qualitative measure to assess learning
  5. Profiles of countries in a state of unrest to predict revolt (the wiki for this project)
  6. Price differences between Ebay and craigslist
  7. The chance of finding a 15 to 64 year old male in Luxembourg that is a noble compared to a similar-aged male in Denmark being a noble.  (easily the most unique project ever done)
  8. Using Wikipedia, how long it takes to get from a random starting word to the article on Philosophy by always clicking the first blue, unbolded, unitalicized link on each page.  Based on the theory that this process eventually leads every Wikipedia article to Philosophy.  Starting words will be chosen via randomized dictionary words.
With a little bit of freedom (okay, a lot of freedom) they come up with some awesome topics and they really enjoy their work in class during these couple of weeks.

How will I grade them?  By writing a list of what the student did well and what the student did not do well. 

Friday, May 27, 2011

A Farewell to Teaching

I'm writing this post to inform everyone who reads this (who knows how many of you there are) that at the end of this school year I will be making a career change.  I will be starting down the path to becoming an actuary, a unique opportunity that I am extremely excited about.

I'd like to thank anyone who has taken the time to read this blog for allowing me to share some unique ideas about education.  I've continued to blog this year because of the amazing conversations I've had with colleagues nationwide.  Most of you I've never met face to face, but I enjoyed having an audience to bounce AP Stat and general education reform ideas off of.  This is something I would not have done with anybody but my personal learning network on the internet.  Again, thank you.

Feel free to use any of my ideas in any capacity that you wish.  No need to be proprietary, as I'm all about moving education forward.  I encourage you to consult with and contribute to the class wiki that we've started this year, which will hopefully evolve into an online statistics resource for any and every statistics student in need.

The Class Wiki: http://statknowledge.wikispaces.com

Will the blog continue?  Probably.  Just in a much different capacity, and probably more math-nerd related than anything.  Again, to my PLN, thank you so much for contributing so much to my professional development.

Thursday, May 26, 2011

School Reform through the Eyes of Albus Dumbledore

"No spell can reawaken the dead, Harry. I trust you know that. Dark and difficult times lie ahead. Soon we must all face the choice between what is right and what is easy." - Albus Dumbledore, Harry Potter and the Goblet of Fire.

I can't help but think of this quote when it comes to education reform, so I'm going to list how things are done the right way and how things are done the easy way in various realms of education.

Implementing Change in a School District
The Easy Way: Write a policy
The Right Way: Have your buildings/staff make the necessary changes they need to help students learn

Technology Integration
The Easy Way: Buy some Smartboards and show the community our commitment to technology
The Right Way: Invest in technology that changes a teaching practice, not technology that isn't much different from an overhead projector

Community Relations
The Easy Way: Cave to whatever demands are placed upon you by the community, tell them you'll do everything they want
The Right Way: Stand up for something and explain to the community how it benefits your district, even if it means spending some money

Instruction
The Easy Way: Hand out some worksheets from a textbook manual and have students follow and copy down what you've done
The Right Way: Find out what each learner needs, and provide them with as much support for those needs as you can

School Structure
The Easy Way: Force all students to assimilate into one model of what a school is
The Right Way: Provide students with several different options to attend school (brick and mortar, cyber, blended, etc) and have them choose which one works best for them

Grading
The Easy Way: Give points, then slowly and methodically remove them for each successive mistake a student makes. 
The Right Way: Have a list of things that a student does well, and some things they must improve upon.  Constantly revise and add to this list throughout the year.

Grading II
The Easy Way: Use a formula to calculate some type of average to summarize what a student can do.
The Right Way: Students have portfolios that you can go through to see their strengths, weaknesses, and interests.

Personnel
The Easy Way: Make decisions based upon tenure/seniority/contract status
The Right Way: Make decisions based upon who is best for your district

Wednesday, May 25, 2011

A Project per Unit: Understanding By Design in AP

Some project ideas for AP Statistics that are unit specific.  I'd begin each unit by asking these questions and having kids consider just what data they'll collect and how they'll answer these questions.  Then build all the statistics concepts around them as they come up, not as a series of "concepts".

Unit I: Displaying and Describing Data
1.  Market Research: design a product, determine the market for it, determine how much you should charge for it
  • Create a coffee stand for your school building
  • Pillow-pack: A backpack with a pillow built in to it
2.  Consumer Education: Choose a series of similar products and determine which one is the best for consumers
  • I have a number of kids working on an end of the year project comparing all types of smartphones
  • Mac vs PC?
  • Tablet PC's
3.  What makes a song popular?

Unit II: Displaying and Describing Bivariate Data (probably my weakest ideas)
1.  Have them collect data that makes them think about what a correlation actually shows (not causation, only a relationship between two variables, etc)
  • Caloric Intake vs Weight 
  • Grams of Fat vs Grams of Protein consumed daily
  • Income Level vs Achievement Level in Standardized tests (state tests, SAT's, AP, etc.)
Unit III: Collecting Data
1.  What's the best sample we can get to investigate one of these questions?  I want them to see just how poorly they collected data in Unit I and II
  • What proportion of the school district has internet access?
  • Revise their sampling methods from previous units
Unit IV: Probability
1.  Design a game of chance that is profitable (run them all together on a casino day complete with fake money to see how profitable it is in the short-run)

Unit V: Inference for Proportions
1.  Continue any of the data collections that were done in the beginning of the year, looking for significant differences (consumer education, market research) - introduces significance, might be boring.
2.  Is there equality that exists between schools/institutions?  What do "richer" schools have that "poorer" schools do not?  Why?
3.  Tell a joke and determine whether or not it is funny.  Possibly look at a comedian's standup routine to determine if they are funny
4.  Type I and Type II Error: Explore wrongful convictions.  Have them explore some court cases that are under contention (OJ Simpson, Mumia, etc) and some court cases that are open and shut (Bernie Madoff).

Unit VI: Inference for Means
1.  Is there an advantage to eating a raw vegan diet vs the traditional "Western" diet? Inspired by recent events as we have recently started a Raw Vegan diet (and have never felt better)
2.  Is there equality between schools/institutions?

Unit VII: Inference when Variables are Related (T-tests for Slope and Chi-Squared Tests)
1.  Should you choose one letter over another when taking a multiple choice test? (sorry for so many testing heavy examples)
2.  Do a t-test for slope on any of the bi-variate data collected in Unit II
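For idea 1 above (should you choose one letter over another?), here's a minimal sketch in Python of the chi-squared goodness-of-fit test that would settle it.  The answer-key counts are hypothetical, just to show the mechanics, and scipy does the heavy lifting:

# Chi-squared goodness-of-fit test: are the five answer letters equally common?
# The observed counts below are hypothetical, for illustration only.
from scipy import stats

observed = [52, 61, 47, 58, 42]                  # hypothetical counts of correct answers keyed A-E
total = sum(observed)
expected = [total / len(observed)] * len(observed)  # null hypothesis: each letter equally likely

chi2, p_value = stats.chisquare(observed, f_exp=expected)
print(f"chi-squared = {chi2:.2f}, p-value = {p_value:.4f}")

A small p-value would suggest the letters on that (hypothetical) answer key are not showing up equally often.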

Most of the inferential statistics can be applied to enhance the projects and work done in the beginning of the year.  See my post on Inferential Statistics Data Collection for others, and of course feel free to lend me your ideas.

Wednesday, May 11, 2011

Inferential Statistics: A Different Approach

For the longest time I've given thought to providing instruction on inferential statistics in a unique fashion.  If you're an AP Stat teacher, it means a departure from the One-Proportion Z-test, Two-Proportion Z-test, One Sample t-test, Two-Sample t-test, Matched Pair t-test, Chi-Squared Test(s), t-tests for slopes of regression lines.

So here's how I'd start...all data collection.  Spend a couple of days collecting data for each situation.  One of the essential questions I'd like my students to explore throughout the year is "Which model is the most appropriate for data you have collected?"  Here's where we go into depth about why certain models are more appropriate than others...

Data to Collect

  1. Number of victories in 100(or so) games of Rock-Paper-Scissors
  2. Toss a thumbtack and record proportion of "up" 
  3. Drop a piece of buttered toast 50(or so) times and measure how many times it lands "buttered-side down"
  4. Give a dummy homework assignment and measure the proportion in each class that complete the assignment
  5. Compare batting averages of two baseball players
  6. Time how long it will take kids to walk to the pool and back
  7. Prices of items at clothing stores (found through browsing catalogs online)
  8. Number of each type of animal cracker per box
  9. How long it will take you to sort beans onto bull's-eyes with your dominant/non-dominant hand
  10. Give the ol' Memory Experiment (groups rate sentences on how hard they are to pronounce or how easily they can form a vivid mental image) and compare the number correct for each group
  11. Count the number of each color of M&M you receive in a sample of M&M's
  12. Change drop-height/rotor length of paper helicopters and record the time it takes to fall


After you spend about a week or so doing data collection, ask students to reflect on how data was collected. Notice also that some activities are done the same way (measuring proportions/means).  I'm fairly certain this has to be done to guide reflections, make kids confused, and ultimately learn something about making generalizations (mathematical modeling at its finest).

Ideas for Reflection:

  1. What was measured in each data collection?  How does it compare with other types of data?
  2. Which activities were useful for making comparisons?
  3. If we're not making a comparison, what can we do with the data we collect?
  4. Does it matter that some samples are smaller than others?
  5. Create a display for each activity with the raw data.  Which models tend to be the most appropriate?
I can see this being two weeks of AP Stat where kids think about collecting data and fitting similar models to similar methods of data collection.  Once they start fitting models to each situation for comparison, then you bring about some hypothesis testing procedures.
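To show where this eventually lands, here's a rough sketch in Python of the one-proportion z-test students would fit to something like the thumbtack data.  The 62-out-of-100 result is hypothetical:

# One-proportion z-test by hand: did the thumbtack land "up" more than half the time?
from math import sqrt
from scipy.stats import norm

p0 = 0.5              # hypothesized proportion of "up" landings
up, n = 62, 100       # hypothetical results: 62 "up" out of 100 tosses

p_hat = up / n
se = sqrt(p0 * (1 - p0) / n)              # standard error under the null hypothesis
z = (p_hat - p0) / se
p_value = 2 * (1 - norm.cdf(abs(z)))      # two-sided p-value

print(f"p-hat = {p_hat}, z = {z:.2f}, p-value = {p_value:.4f}")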

If you're a Stat teacher or not, provide your suggestions and ideas for data to collect.  It'd be great to get a new type of data to collect from somebody outside of the Stat realm.

AP Stat Test Today

I am sneaking this in before the 48-hour moratorium on discussing anything about the AP Statistics test goes into effect.  Hopefully the gentlemen in dark suits and sunglasses from the College Board that I see lurking outside my window will understand.

Today my students will take the AP Stat exam.  I have no idea how well they will do.  I'm fairly certain that the students this test wants to get a "5" will get a "5".  I've told my students that they are much more important to me than a test score.  After all the amazing things they do in class, it seems rather anti-climactic to summarize them with a number between 1 and 5.

One of the worries that I have is that I spend most of my year allowing my students to demonstrate what they've learned in whichever way they wish.  Some like to create videos, some create Prezi's, others are content to write a report.  At the bottom of it all is my belief in giving kids a choice in what they wish to do, so that they can enjoy a learning experience other than "listen-do this-receive grade".

But the AP Stat test will not take into account the freedom of choice and "uber-differentiation" that has made my classroom something everybody enjoys.  I do think that the students who have chosen the most in-depth learning experiences will translate their knowledge to any medium, even if it's some boring old test.  What about the kids who chose to write a report, or chose a learning experience that wasn't all that in-depth?  If I told them to choose something else that would be better in terms of test preparation, then I wouldn't really be giving them their choice at all.

With such a push for meeting individual student needs, how can we only give students 40 questions with 5 different answer choices to show what they know?  On the free response section, how come they have to answer parts (a), (b), and (c) when the most enjoyable part would be to explore option (d) that doesn't exist?

Next year, I'm going to suggest that none of my students take the AP exam.  There...I said it.  Our reasoning will be that we are going to do something better than answer someone else's questions, we're going to answer our own.

Friday, April 29, 2011

Why Schools Do Not Innovate

This post is in response to Scott Swindells' post of "Where's the Innovation?".

A recent blog post by Seth Godin really resonated with me as to a school district's reluctance to innovate.  He writes:

It's impossible to have a coin with only one side. You can't have heads without tails.

Innovation is like that. Initiative is like that. Art is like that.

You can't have success unless you're prepared to have failure.

As soon as you say, "failure is not an option," you've just said, "innovation is not an option."
This is precisely why most school systems are institutes of non-innovation. Innovation within a school system (okay, within all systems) frequently involves money.  A district doesn't want any money it spends to result in a failure, since taxpayers demand a constant "positive return on investment" (that phrase mixed with education makes me cringe, so I'll only use it once).  Nobody wants to anger the taxpayers by spending money on something that fails.
This is the "failure is not an option" approach.

Wednesday, April 27, 2011

Who decides what kids should learn?

Stop me if you've heard these before...
"Kids these days can't do simple math without a calculator!"
"Kids these days can't write well at all!"
"Kids these days are lazy!"

So what?  What gives you the right to tell students today what they should learn?  You've never met my students, so in my view you have zero authority to tell them what they should be learning.  On the flip side, if you've never met my students, why is it okay to tell them what they don't need? (as you begin making those budget cuts to eliminate foreign languages and the arts)

Ultimately our kids should have the freedom to learn whatever they want.  There is no reason that every student should have to take Algebra II before they graduate high school.  If they're interested in it or would like to try mathematics, then go for it.  If they have an interest in art, why would we then tell them that they can only take one art course this semester since they have to take 6 other subjects they don't care about?

Imagine if we required every student to take a painting and drawing class every year from 7th grade to 12th grade.  Why does that sound so blasphemous, yet we can easily require them to take a math class (or two) every year from 7th grade to 12th grade?

Don't get me wrong, I see the great benefits in students taking any mathematics courses.  I'm a math teacher.  I want kids to discover their own interest in learning mathematics on their own schedule, not on some mandated timetable. 

Friday, April 15, 2011

AP Stat Lesson: Type I and Type II Errors

THE EXCEL SPREADSHEET
Type I and Type II Errors (housed on Box.net...is there a better way to do this?)
Directions for the activity contained in the spreadsheet.

SKILLS ADDRESSED
Statistical significance, Confidence Intervals related to hypothesis tests, Type I Error, Type II Error, Power, Alpha, Beta

THE CONTEXT
A factory is producing pharmaceutical grade glass vials.  Quality control engineers are employed to see if the factory is producing items at or below the industry standard of 5% defects.  They take a sample of size 100 (Mistake #1: I know this violates np > 10, but it turns out that you wind up failing to reject a lot, and it leads to a good understanding of Type II error) and determine the proportion of their sample that is defective.  (Mistake #2: sampling 100 items and getting a defective proportion of 0.063 is impossible.  They will need to round to a whole number of successes when using the graphing calculator.)  Based on the results of this hypothesis test, they will decide if the factory must undergo a quality control review or continue with business as usual.


THE ACTIVITY
1ST PART - CONDUCT THE TEST, DECIDE WHETHER TO REJECT OR FAIL TO REJECT
It's dynamic.  Each kid will receive a randomly generated proportion.  They may do this up to 50 times.
First part of the activity: conduct the one proportion z-test using your graphing calculator.

2ND PART - DECIDE IF YOU MADE THE RIGHT DECISION
Unlock the spreadsheet (password: apstat5).  Have them change the fill color of the "True Proportion" column to reveal the true proportion of items that are actually defective.  They then evaluate their decision as to whether it was correct or incorrect.  Cue a whole class discussion on the 4 different scenarios of errors, then slap the AP Stat vocabulary on.
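If you'd rather see the same idea outside the spreadsheet, here is a quick Python sketch of it: pick a true defect rate, simulate samples of 100, run the one-sided test of H0: p = 0.05, and count how often the decision is wrong.  The 0.08 true rate is my hypothetical choice for illustration, not part of the original activity:

# Simulate the quality-control test: H0: p = 0.05 vs Ha: p > 0.05, n = 100.
# With a true defect rate of 0.08, H0 is actually false, so every
# "fail to reject" decision is a Type II error.
import random
from math import sqrt
from scipy.stats import norm

p0, alpha, n = 0.05, 0.05, 100
true_p = 0.08                     # hypothetical true defect rate
z_crit = norm.ppf(1 - alpha)      # one-sided critical value

type2 = 0
trials = 1000
for _ in range(trials):
    defects = sum(random.random() < true_p for _ in range(n))
    p_hat = defects / n
    z = (p_hat - p0) / sqrt(p0 * (1 - p0) / n)
    if z <= z_crit:               # fail to reject while H0 is false: Type II error
        type2 += 1

print(f"Estimated Type II error rate (beta): {type2 / trials:.2f}")

With the null hypothesis actually false, every "fail to reject" is a Type II error, which is exactly the question the kids ran into.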

REFLECTION
Two huge mistakes that led to an amazing understanding of errors.  An overly planned lesson would have avoided these mistakes.  It also would not have generated a discussion on appropriate assumptions and conditions for inference.  An overly planned lesson would have also not brought up the question of "How come I'm failing to reject so much when it's false?"  An awesome comment: "I would not have learned this that well if I didn't have to think about those things."

This was 3 days worth of 46-minute classes.  Let's see how they do on the assessment of these skills.

THE SLIDESHOW


Coming soon to this post...
The Google Form Assessment
A better version of the spreadsheet that allows any null hypothesis and any sample size.
100 comments on how to make this even better (hopefully)

Thursday, April 14, 2011

Good Teachers are Worth their Weight in Gold

The popular line from those that wish to criticize teachers is that "Good teachers are worth their weight in gold".

Every person that says this follows it up with a list of excuses for why we should pay them in dirt.

Treat teachers like they're providing your students with an education.  Treat teachers like people that are developing children into citizens.  They're not being asked to peddle some wares, manufacture something, or generate more money.  They're being asked to nurture and develop citizenship.

A teacher can make one child a great citizen, and in doing so they would outshine all the salesmen in the world.

Thursday, April 7, 2011

AP Stat Lesson: Confidence Intervals (Graduation Party)

The Excel file: Graduation Party simulation.

What this Excel file does is simulate a student sending out 1500 invitations to a graduation party.  There is a true proportion of people who will attend, but it is unknown (the "Population" tab of the Excel spreadsheet is completely blacked out and password protected).  If you'd like the unlocked version, feel free to get in touch with me and I can send it along.  Students will conduct samples of 20, 50, and 100 to estimate the true proportion, and once they've generated a sufficient number of each sample size, they'll take a guess as to what the true proportion is.  Discussion follows as to which sample was most helpful to make the guess from.  Most guesses are that the true proportion is between 0.2 and 0.3.

They choose one of their sample proportions for sample size 100 and create a confidence interval for 4 different confidence levels: 68%, 90%, 95%, 99.7% (not randomly thought up by any means).  I chose these confidence levels because in the past I've seen students not associate confidence intervals with a middle percentage.
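A sketch of the coverage idea behind those four levels, in Python: fix a true proportion (0.25 here, purely hypothetical since the real one stays hidden in the spreadsheet), take lots of samples of 100, and see how often the one-proportion z-interval captures it at each confidence level:

# Coverage check for one-proportion z-intervals at four confidence levels.
# The true proportion is a made-up stand-in for the hidden one.
import random
from math import sqrt
from scipy.stats import norm

true_p, n, reps = 0.25, 100, 2000
for level in [0.68, 0.90, 0.95, 0.997]:
    z_star = norm.ppf(0.5 + level / 2)    # critical value for the middle `level` of the curve
    captured = 0
    for _ in range(reps):
        yes = sum(random.random() < true_p for _ in range(n))
        p_hat = yes / n
        margin = z_star * sqrt(p_hat * (1 - p_hat) / n)
        if p_hat - margin <= true_p <= p_hat + margin:
            captured += 1
    print(f"{level:.1%} confidence: captured the true proportion {captured / reps:.1%} of the time")

The printed percentages should land near 68, 90, 95, and 99.7, which is exactly the "middle percentage" association I want them to make.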

Collect students' intervals using this form: One-Proportion Z-intervals Data Collection Form 
Display their responses here: One-Proportion Z-Intervals Raw Data (pay attention to both tabs, one has intervals and one has whether or not the interval captures the true proportion)

Once they've done some thinking, they will open this Excel file (One-Proportion Z-intervals Displays of Each Interval), providing a visual of each confidence interval.

We follow with having students lead their own discussion.  They'll begin by posting comments to a specific page of the class wiki, in order to get them to jot down an initial reaction to see if what they thought still holds up, or if their thinking needs revision.  A whole class discussion follows, and I challenge them to not allow me to speak for 10 minutes.  This can be difficult for me, but it is extremely difficult for them.

Wednesday, April 6, 2011

AP Stat Lesson: Unstructured Investigation

Give kids this: Hank Aaron - Home Runs by Pitcher

Let all hell break loose.

I've become quite a fan of "unstructured time" as of late, and I think this is perfect for an AP Statistics class.  I can see my baseball fans in class leaping at this opportunity to explore some baseball stats.  Whatever they wish to investigate, they are free to do so.  For the non-baseball fans, it may be just an opportunity to learn something about baseball.  I owe them some data on what they're interested in.  I look forward to hearing what a student that knows very little about baseball has to say.

Possible investigations:
1.  Comparison of Barry Bonds(or any other great home run hitter) to Hank Aaron. 
2.  Are pitchers today better (as a whole) than the ones Hank Aaron faced? (this comes from listening to Colin Cowherd say that Babe Ruth hit his home runs against guys who drove a milk truck in the off-season)
3.  Just how biased is this website?

Let them come up with how they're going to do any of the above. 

Tuesday, April 5, 2011

How does change happen?

First of all, check out the new blog format.

Second of all, today we played the Making Change for School Improvement game with fellow Montgomery County instructional coaches and it was awesome.  Our group successfully moved every teacher to becoming a routine user, but ran out of money to perform a complete curriculum revamp.  Amazingly, we still had plenty of funding to give our kids standardized tests (kidding).

What I left my meeting with today is the realization that I don't talk to nearly enough people in our district.  I have a great rapport with our teachers.  I am friendly with our administrators.  It's REALLY hard to have a conversation about things that need to change with administrators, though.  As a whole, I feel like preservation of the status quo can sometimes be more important to them.  Or, the need for change is viewed as such a huge problem that mere mortals are powerless to do anything about it.

It's become necessary for me to provide data at every waking moment to support what we're doing as instructional coaches.  I'm the one responsible for providing it, when there's a real easy way to collect that data.

If you want to see the effect that an instructional coach is having, GO INTO A CLASSROOM and see.  Look at what is happening in the classroom first, then decide what additional data (if any) you need.  As a statistics teacher, my recommendation is to gather data to assist in making an informed decision.  Please try to avoid gathering data to support an argument for/against.  It turns litigious and confrontational.

I had the pleasure of speaking with @kenrodoff about how his administrative team walked through classrooms for 6 hours as part of an ISTE site visit and was amazed at what was happening with technology. Hopefully this leads to a continued and improved support for instructional technology within their district.  Honestly, I have no doubt about it.

My classroom door is wide open for anyone that would like to visit.  My students and I would love to share what we're doing with you.  I have no reservations and I don't get scared when an administrator walks into my room.  I want you to come in.  I want to share.  I want to be seen because I believe what I'm doing is in my students' best interest.  I am proud of the fact that what happens in my classroom is different than every other classroom in the district.  I have no idea how well my students will do on an AP Exam, in fact, I don't really care.  I care that my students are producing something they genuinely care about and are interested in.

Here's some of the great things happening:
Probability presented through Penalty Kicks
Binomial vs Geometric Probability: A Jimmer Fredette Example
Experimental Design: Pokemon and Smash Bros

Friday, March 25, 2011

PA Education Budget...Am I reading this correctly?

To all my fellow educators in PA, readers in PA, and educators nationwide, please visit the link below and open the Excel file.

PA Dept of Education Budget

Check out line 10, the line for "PA Assessment".  Yes, that 36,590 is in thousands.

Compare with line 9, the line for "Information and Technology Improvement".  4,266.

Pick jaw up off floor.

A statewide system of standardized testing is costing us $36,590,000, yet they are calling for teacher layoffs?

Am I reading this correctly?  Please tell me if you have a different interpretation. If I am reading this correctly, then every person in the state of PA needs to see this.

Big props to Scott (Tuesday's With Swindy) for pointing me towards this.

Wednesday, March 23, 2011

You Say It Best When You Say Nothing At All (about Hypothesis Testing)

I really like sharing things that work really well.  I don't know if it's a good idea on its own, but this worked great considering we've done a lot of remediation (for students that needed it) and a lot of projects that delve deeper into statistics.  They are working with the basics of statistics all the time, so tying it together into making statistical inferences is fairly easy when they have a good foundation.  I'd also like to think it has something to do with the way we've interacted with hypothesis tests in class.

Thumbtacks - Introduction
It begins with this form (Inference for Proportions) and handing out one thumbtack.  In their own heads, students decide what they think the proportion of "up" landings will be and what it would take to convince them they were wrong.  Then they toss the tack so they can compare their observations to the model they've developed.  Sounds a lot like your entire hypothesis testing/inferential statistics unit.

The NCAA Basketball Tournament - A Basic Example
Kids then made predictions for the NCAA tournament and we tested just how good they were at doing so by comparing their proportion of correct first round picks to randomly guessing (p = 0.5).  A big question that came up was "Are we just doing this for the first round?" and in true teacher fashion I said, "Yes.  Now how come we're only doing it for the first round?"  Cue a killer discussion about large enough sample sizes and the Success/Failure condition.
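The "why only the first round" question comes down to the Success/Failure condition, and it's easy to check.  A tiny Python sketch, assuming a standard 64-team bracket (32 first-round games, then 16, 8, 4, 2, 1):

# Check the Success/Failure condition (np >= 10 and n(1-p) >= 10) for each round,
# assuming a standard 64-team bracket and null p = 0.5.
p0 = 0.5
for round_name, games in [("Round of 64", 32), ("Round of 32", 16), ("Sweet 16", 8),
                          ("Elite 8", 4), ("Final 4", 2), ("Championship", 1)]:
    ok = games * p0 >= 10 and games * (1 - p0) >= 10
    print(f"{round_name}: {games} games -> Success/Failure condition {'met' if ok else 'NOT met'}")

Only the first round gives you at least 10 expected successes and 10 expected failures under p = 0.5.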

Back to Thumbtacks - Put it into practice
Fire up the laptops and open up what the rest of your classmates thought (What They Thought).  Immediately they began to think "Why did this kid think they were incorrect when they got a lower proportion than what they thought they needed to be incorrect?"...a not-so-formalized way to think about a standard of proof and a low enough p-value to reject the null hypothesis.  This was one of those points in class where I said nothing and let their brains piece together what they were looking at.  I clarified what we were looking at, then asked them to pick the case that they thought was theirs and test the original hypothesis.

Conclusions
On the board, write your p-value and whether or not you rejected your original hypothesis.  As a class we'll have a look at everyone's p-values and decisions, then decide who has correctly rejected/not rejected.  They all argue about what p-value is considered "low enough" that you have to reject.  One of those moments where again, I say nothing and they develop an understanding of alpha-levels.  Not so formal...yet.


Projects/Practice
Pick another one of those contexts from the Inference for Proportions form and investigate it.  I think I'm going to add some more situations/contexts.  I'm also not sure that they ever need to fill out that form more than once...


What's Left to Do?
Sit back, relax, and let the 5's on the AP exam roll in.  Dress it up.  Attach all the formal AP Exam terms/vocab/stuff to what they've already understood.  Then....
1.  So what really is the true proportion? (Confidence Intervals)
2.  Is what we got really that different? (two proportions)
3.  Repeat procedure for sample means instead of proportions
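For that last step, a minimal Python sketch of the "repeat for means" idea: a one-sample t-test on some hypothetical measurements (the data and the null mean of 12.0 are made up; scipy does the work):

# One-sample t-test on hypothetical measurement data, H0: true mean is 12.0.
from scipy import stats

times = [12.1, 11.4, 13.0, 12.6, 11.9, 12.8, 12.2, 13.4]   # hypothetical measurements
t_stat, p_value = stats.ttest_1samp(times, popmean=12.0)
print(f"t = {t_stat:.2f}, p-value = {p_value:.4f}")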

Feel free to go to our class wiki for any supplemental exercises/materials.

Wednesday, March 16, 2011

Homework? More like "1st Period"-work

At our school, the typical student will take 7 courses for the entire year.  If that student has AP Stat with Mr. C on their schedule, this means that a student has 6 classes in which they will have homework.  6 other classes with one hour of homework assigned each night means approximately 6 hours of homework.

Some careful accounting as to how those 6 hours that could be spent working on homework are actually spent.
Hour 1: Facebook
Hour 2: Call of Duty
Hour 3: Facebook
Hour 4: Facebook
Hour 5: Facebook
Hour 6: Being a 17 year old....and throw in a little Facebook

I don't think that I'm very far off for the general population of high school students.  Knowing that this is the truth, just why do we assign homework again?

Where do those hours get made up?  During the 6 hours they are in school, simply because that is how you earn the points for homework completion.  Do it in AP Stat, show it in Pre-Calc, consume points, repeat tomorrow morning in AP Stat.  And it's trigonometry worksheets.  Is my AP Stat class really less exciting than trig worksheets?  Not a chance, but the homework "points" are more exciting as they appease the grade monger in all of us.

I'm all for allowing my students to make a choice about what they will be working on in class, as long as that choice involves learning.  Consistently in my 1st period class (and each other class that I teach), the choice is made to fill in some unknown quantities for some triangles that nobody cares about (not even the teacher who assigned the homework).  Why are you assigning homework if it's getting done in AP Stat?  Can't they do that work in your class?

I don't assign homework.  I don't plan on ever assigning homework again.  I plan on assigning worthwhile assignments that students can make the choice to complete at home if they want to make it even more awesome.  I plan on doing things that are valuable in class so that students do not need to work at home.

Imagine if your job gave you a minimal amount of time each day to complete everything for the next day for your 6 different classes and your boss told you, "Just do the rest of it at home."  Oh, wait.....

Well, anyway, back to my 46 minute planning period :)

What defines "failure" in education?

This week is state testing week in our school district.  Watching these kids go through this procedure, I can't help but feel bad for them.  This is what their schooling career comes down to, in the eyes of the people giving out money and deciding whether or not they are a part of a failing school.  Read that last sentence again.  Really?

So you're telling me if the students in my class decide they want to solve the world hunger crisis using what they're learning and what they know, this won't be factored in to the final decision as to whether or not they fail? What if we decide to use our knowledge to assist those in need in Japan?  Just how much of an impact do we need to make before this is acknowledged before test scores?

Those two outcomes get us a nice article in the newspaper, but we could still be considered "failures" without the right test scores.  How is this right?  How does it change?

Sunday, March 6, 2011

A Crazy Idea for Professional Development

We are required to do 14 hours of documented professional development.  I'm sure that this hour requirement is similar at most school districts.  I could do 14 hours of professional development standing on my head.  My estimation is that I spend 5 hours a week reading the professional writings of my colleagues, and another 5 on Twitter engaging my colleagues in professional discourse.  Not to mention the discussions I have with colleagues that I actually work in the same building with. 

When will professional development be more than just an accumulation of hours?  It is a necessity in our classroom to differentiate instruction so that we nurture the interests of our students and provide them with a customized learning experience.  The extent of differentiation that goes into our professional development is the variety of 2-hour workshops in the district catalog to pick from (as long as you pick 7). 

Everybody choosing from the same pool of workshops leads us to a situation in which we have a large number of people with an average skill level.  What better way to generate organizational inefficiency than not nurturing experts? 

So my proposal for professional development: A Portfolio.  A teacher and a principal can have a discussion that begins with "Show me what you've done to become a better educator."  What follows is a look at the portfolio, talk about what's been learned, talk about interest level, what was good, what wasn't so good.  The principal can provide some feedback and suggestions as to where to go next.  Maybe this conversation is a little bit better than putting a check mark next to the teacher that has successfully completed the 14 hours?


A district would still offer the same catalog, but employees would not be limited to taking just those courses.  You can expand your offerings as need be, when you find there are a number of educators looking for something that just isn't offered.  The district catalog becomes customized for the individuals that need it, not the other way around. 



I'm not even going to mention how happy this proposal would make the educational outsider that thinks teachers have no incentive to improve when they get tenure.  Okay, I mentioned it.

Thursday, March 3, 2011

Challenges of Project Based Learning (PBL) in AP Stat

Doing project based learning (PBL) in AP Stat this year has been a challenge. Ultimately, the best part about it is the learning experiences and opportunities it provides students. Every time a project is completed I think of 300 things that could be changed to make it, excuse the phrase, "100% better" (kind of an inside joke).  Below I've listed my major concerns about these projects, and the solutions I'm considering.  Your input on any of these is greatly appreciated.

Concerns and Solutions
1. The project was not rigorous enough. It covered too many skills too broadly, or too few skills in unnecessary depth.
     I want to make sure I create a sample project for each assignment to see just how in-depth the projects go.  I'm guilty of doing a bare-bones example (okay, sometimes not even one at all).  There, I said it: I don't always do the projects I assign.  The reason for this is to learn alongside my students.

2. Some kids put a lot of work into a project that just doesn't really address much content, does so incorrectly, or doesn't really get into depth.
     Do it over. It's worth the learning experience of starting from scratch and completing the project again. 

3. What are they actually learning, and can they replicate it?
     Most of the time a project will involve them learning a new piece of technology as well as learning an AP Stat concept in greater detail.  I'm not sure these projects translate very well to getting an answer correct on the AP exam. Honestly, I want my projects to be far removed from getting right answers on an AP exam.


The Stat Project Process
1. Skills Organization - lay out the content related skills you will be addressing in your project
     Example: conditions for using the binomial/geometric probability distributions, calculating probability for each distribution, determining expected value and standard deviation for a probability distribution

2. Place context on each skill - group brainstorming to see what context fits each skill the best
     Example: highlight the difference between the two probability distributions by filming students walking down the hallway until we observe one of them wearing earbuds (geometric). Compare with a binomial distribution, showing 10 kids walking down the hall, 5 of whom are wearing earbuds. 
 
3. Storyboard/Product: what multimedia can we put together, how does it flow, how does everything fit together?  
     Here's where students choose a tool that meets their project's needs.  

4. Edits - does anything need to be rethought or redone as something better?  
     High school students seem to miss this step in almost everything they do, once the "be done" mentality takes over. Sometimes the "do it over" option is the best learning experience.  I've found I've spent more time suggesting they edit and critique their own work and each other's work, and it's made a world of difference in overall quality of product and understanding of statistics.  

Since I don't believe in giving deadlines for learning, when a student asks when their projects are due, I tell them that they may turn them in whenever this process is completed.  With most projects I honestly don't think this process is ever completed.

Friday, February 25, 2011

No Decorations in my Classroom

I don't decorate my classroom. I've referred to it as "hospital" sterile when asked what my room looks like.  The only decorations I'd like to have in my room are signs that say: "LEARN FROM THE PERSON NEXT TO YOU".  My philosophy on this is that it shouldn't matter what's on the walls, let's start caring about what's happening inside the walls.  If it's really cool, then kids will start hanging stuff on the walls and decorating the room the way they want to.

A few years back, I had a student who would draw pictures to hang on the wall; half of them were about AP Stat. All of them were entertaining. The ones that were about stat were amazingly good depictions of AP Stat topics.  Stay tuned for me to scan these images in and post the best of them.

I think we're in trouble if we concern ourselves too much with what the content looks like, rather than what the content actually is. Make that double if we use technology to change only what the content looks like.  When I choose a technology tool, I want tools that make students think deeper first, and ooh and ahh later.  This is probably why I choose Wikispaces, Google Docs, then Microsoft Excel every single time.

I recently responded to a tweet from @nwhyluckysgirl regarding using emoticons when commenting on student work electronically.  
@nwhyluckysgirl: "when commenting on student work electronically, do you use emoticons?" 
@jasonchri: "the occasional emoticon, not too often.  The comment usually is seen only for the emoticon"

My fellow tech integrator and I often debate look and feel, mostly on the look and feel of our district's technology wiki, NP Tech Tools. I keep my class wiki on the default background and format, and I honestly don't care to spend time choosing the right template/background/picture. The content of my class wiki should be the focus, and for that matter I would hope that's what's interesting about it. If you're reading this blog, you'll see that I take a similar approach to blog layout (and have not yet switched to WordPress).  How are you going to see the content if there's a thousand other things to look at on the page? 

I was inspired to write this after reading a brilliant post (Coloring Books or Canvasses?) from Spencer's Scratch Pad about technology that makes students think deeply about a subject. Use tech to make students think more, not think about something else, not comment on how weird or cool something looks.  

Making the choice of what tech tool you want to use needs to be content driven.  I wind up picking one of the same three tools (Wikis, Google Docs, Microsoft Excel), since experience has shown these tools can stimulate conversation and let students think the way that they want to.

Some of my tricks...
1.  An Excel spreadsheet that is completely protected, so that they can only manipulate and change certain values to notice some patterns.  They'll need to think their way through certain processes too, not simply plug-in numbers and tell Excel to perform a calculation. 
2.  A massively shared Google document to write about the difference between two(or more) topics.  I'm trying this to get an entire class to discuss the difference between binomial and geometric distributions. 
3.  Class wiki: leave the set of skills completely blank, so students determine just how to organize the lessons and what we've thought about

I'm branching out...
Forever I've wanted to use more and more multimedia ideas in my class.  Not just make a music video (a lot of times kids spend a ton of time on creating a video, and frequently miss the boat on content).  I want students to show me everything they know about probability, but do so in a very short video (30 sec - 1 min).
I'm going to suggest Animoto, but I really want to allow them to pick any tool that they should find, as long as it communicates everything they know about probability.  I also want to have them use pictures that they've taken, just to get them thinking a little bit more.  

I want to come up with the best web tools/tech tools for education, and I think this is how I want to start. Which tools make students think more, and think deeper? Which ones accomplish the same thing as a "solve for x" worksheet?  Do certain tools get misused?  Are there some tools that are all bark and no bite?

Sunday, February 20, 2011

Know What's Wrong With Kids These Days...

Absolutely nothing. 

A 16-year-old will rarely, if ever, act like a 30-year-old.  Especially when surrounded by 1000 other 16-year-olds.  

They're teenagers, so why do we expect that they'd be studious individuals who devote 4 hours per night to studying for our tests (I've stopped giving tests to eliminate this ridiculous expectation)? Did any of us actually spend hours upon hours studying?  In high school? In college even?  Also, does anyone actually sit and show students how to study/prepare for one of our tests?  Then why do we expect them to be studying experts? 
 
When a student gets a 40 percent on a test, the first conclusion is that they didn't study hard enough.  The second conclusion is that they haven't been working hard enough.  These are easy conclusions to make, and maybe that's why we jump to them so quickly. Blaming the student for acting like a teenager is much easier than being responsible for making that student better.

It's hard to say that a student hasn't learned the material completely and may need to spend some more time learning it.  So many would think that when we say this, it implies that we have "failed" as teachers.  You know, 'cause it's easy to provide instruction that leads to 100 percent mastery for all of the unique learners in our room.  We have to expect that not everyone is going to learn things the same way in the same amount of time.  This teaching game would be way too easy if there was a fail-safe method to get all students to the exact same level of mastery of a skill, in the exact same amount of time.  When we think we've failed or are failing, that's when we tend to blame others.

Bottom line: blaming is easy, much easier than accepting a portion of the responsibility for something. When we blame kids for the problems in our classroom, we're taking the easy way out. When we blame teachers for the problems in our classroom, we're taking the easy way out. When we blame parents for the problems in our classroom, we're taking the easy way out. When we blame politicians for the problems in our classroom, we're taking the easy way out.  When we blame administrators for the problems in our classroom, we're taking the easy way out.  When we blame lack of funding for the problems in our classroom, we're taking the easy way out.  When we blame unions for the problems in our classroom, we're taking the easy way out.  Throwing each other under the bus only results in having all the people we need help from being under a bus, and leaves us standing alone on the sidelines.

The calls for accountability really need to stop, so we can start acknowledging that we all share the responsibility for students learning. Real reform will occur once we start thinking about how students learn, and agreeing to collectively share responsibility for educating our students.  Pseudo-reform is going on when we start playing the blame game.  

Wednesday, February 16, 2011

Today's AP Stat Lesson: EXCEL HEAVY - VLOOKUP(RANDBETWEEN(NERD, GEEK), STATGEEKS,2)

Today's (tomorrow's) plan for AP Statistics is a little Excel-heavy, something that I hope carries over for my students into college and beyond. Is there a standardized test that measures a student's increased proficiency at Microsoft Excel, or other computer apps for that matter?  Most commands involve looking up a value at random between 1 and 100.

Each student has been keeping track of the number of sheets of paper received in each class, each day, over the span of about 2 months. The focus is on teacher-created paper, so if a student uses a piece of their own notebook paper it doesn't count.  Incidentally, I'm pretty sure that AP Stat is dead last among every one of my students' major subjects (AP Stat instructor pats himself on the back).

Using this info, they are going to create a probability model for the number of sheets of paper received for a single class.

For AP Stat...
SHEETS OF PAPER    0      1      2
PROBABILITY        0.90   0.05   0.05

Now that the probability model is created, the Excel fun can begin.  Number all cells in one column from 1-100 to represent a possible outcome. Place each possible outcome in the second column according to the probability you observed. Example: in the chart above, the probability of receiving 0 sheets of paper was 0.90, so spaces 1-90 would be 0. The probability of receiving 1 was 0.05, so 91-95 would be 1, and 96-100 would be 2.  Repeat for all other possible outcomes. Suggest that students choose a class with a manageable number of different outcomes for sheets of paper.  Yes, there is a way to make Excel do this, but that is a little too awesome for an AP Stat class.  

Create a new sheet for simulating new days of each class. For the first day, we are going to "look up" random values from the previous sheet to see how many sheets of paper we will receive. The command for doing this: VLOOKUP(RANDBETWEEN(1,100), the lookup range on the previous sheet, 2).  In other words, pick a random number from 1 to 100, find its row on the previous sheet, and return the value in the second column. 
Example:  =VLOOKUP(RANDBETWEEN(1,100),Sheet2!A$1:B$100,2)

Keep the dollar signs so that when you drag to autofill the formula they continue to look within the same array of cells. Autofill about 200 or 300 of these cells...or 500 :)

Next we're going to calculate the average number of sheets of paper we've received each day. AVERAGE(cell to the left and all cells above). 
Example: =AVERAGE(B2:B6)

Now we can look at the average over the long run (1000 days, or even more) and begin to build our definition of Expected Value.  Then, later, we'll do expected value the easy way in Microsoft Excel using the formula.  I wanted them to wrap their brains around the definition of Expected Value before they began using the formula: E(X) = SUM[X*p(X)].  It's pretty cool to see just how long of a run you need to make the simulation average approach the expected value.
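If you want the same simulation outside of Excel, here is a rough Python equivalent: draw a bunch of simulated class days from the probability model above and watch the running average settle toward E(X).  The model below is the 0.90/0.05/0.05 example from the table; swap in a student's real data:

# Simulate class days from the probability model and watch the running
# average approach the expected value E(X) = sum of X * p(X).
import random

model = {0: 0.90, 1: 0.05, 2: 0.05}      # sheets of paper -> probability (from the table above)
outcomes = list(model.keys())
weights = list(model.values())
expected_value = sum(x * p for x, p in model.items())

total = 0
for day in range(1, 1001):                # simulate 1000 class days
    total += random.choices(outcomes, weights=weights)[0]
    if day % 200 == 0:
        print(f"After {day} simulated days, average = {total / day:.3f}")

print(f"Expected value E(X) = {expected_value:.3f}")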

The lesson is a bit like a cooking show, but it's the first time we are in Excel. The kids use their own data, so that's enjoyable/unique for most of them. The bigger idea of simulating outcomes is very powerful in learning statistics. I love nothing more than a student-centered approach, but I would hate to say "discover how to use the VLOOKUP and RANDBETWEEN functions in Excel."  If anyone has a way to take a constructivist approach to learning Microsoft Excel, I'd love to hear it.


Thursday, February 10, 2011

My Test with No Right Answers

At the slightly more than halfway point in the year, I decided to give my students an assessment on the skills we have learned so far this year.  In an AP Stat class, this type of exercise is meant to take the place of the 2 weeks of "review" that most educators wind up with at the end of the year before the AP Exam.

Here's the test: AP Statistics Mid-Year Assessment.  Using this link (and excusing the poor formatting that comes with inserting images into Google Docs), they are to create a Google Doc and share it with me that is essentially their stream-of-consciousness statistical reasoning.  They were permitted to use the class wiki, their Stat textbook (BVD - Stats: Modeling the World), and any other online resources available to them.  Big thanks to David Wees (@davidwees) for the data, graph, and the article!

They were not permitted to use each other as a resource, as this was an assessment to see what they know as individuals.  This makes me uncomfortable, since it doesn't necessarily follow the cooperative model we've built the classroom on.  Next time I give an assessment, I'll allow them to use each other as a resource as well, but the task needs to be uber-authentic.  Suggestions?
I'll share a conversation I had with a student about this, since it made my day.

"Mr. C, can we use each other as a resource?" - Student
"I'd prefer not, since I want to see what you know." - Me
"But in 'real life' we'd be allowed to use each other." - Student
"Arrggghhh, why must you make that argument?" - Me
"Because you taught me to." - Student

The first thing a Stat teacher, or anyone familiar with statistics, will notice is that for most of the questions there's not really one "right" answer.  The purpose behind this is to allow kids to think and write what they know, in regard to the skills being assessed.  I'm loving the thought that has gone into each response.  It feels like I've really touched on something here, something that works a little bit better than the "write everything you know about the topic" strategy for the poor test taker/assessment do-er.

So what resource did kids use the most?  At quick glance, my class wiki comes in first, with the Stat textbook coming in second.  I'm hoping this spurs more content creation on the class wiki, so that it can develop into the ultimate online stat textbook.  Yes, I am still living that dream.

Tuesday, February 8, 2011

What to do When the Projector Bulb Breaks? Don't Use It...

I float between 3 different classrooms, all of them equipped in pretty much the same way (laptops, Promethean board, etc).  Upon arrival at my last class of the day, the teacher who is primarily in that room asked me, "How are your punting skills?" and then informed me that the projector bulb blew right in the middle of one of his Calculus lessons.

Fortunately for me on this day, my students did not need to use the board (please note the use of "my students" instead of "I" in this sentence).  So what happened?  I made a few adjustments here and there, and I had one of the best AP Stat lessons I've ever had.

The Plan
Students were assigned three "drawing" tasks, for which they could use any drawing utility they wanted (Paint, a Drawing in Google Docs, old-fashioned pencil and paper). 
1.  Construct a tree diagram labeled with proper notation to distinguish between P(A), P(A|B), and P(A and B)

2.  Construct a tree diagram for a two-card Texas Hold 'Em hand and for two rolls of a die (illustrate the difference between independent and non-independent events)
3.  Construct 2 Venn Diagrams for drawing one card and the event that it's an Ace or a Heart and also for the event that it's an Ace or a Ten (difference between disjoint and non-disjoint events)

What Happened
I found myself sitting with each individual student, having a conversation about at least one of the topics mentioned above.  None of these conversations began with a student asking "Did I do this right?"  Most of the conversations led to one or more of the skills I had targeted with my original plan.  The students I wasn't actively talking to were talking with others, having discussions about the likelihood of getting two aces in a hand.  Another pair discussed the difference between independence and non-independence through examples other than the ones they built their tree diagrams from.
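That two-aces question is a nice one to settle both ways, and it mirrors the P(A and B) = P(A) * P(B|A) idea from the tree-diagram task.  A quick Python sketch (the simulation is just a sanity check on the conditional-probability calculation):

# The "two aces" question, done both ways: exact conditional probability
# (the tree-diagram calculation) and a quick simulation check.
import random

exact = (4 / 52) * (3 / 51)       # P(first ace) * P(second ace | first ace)

deck = ["A"] * 4 + ["x"] * 48     # 4 aces, 48 other cards
hits = 0
trials = 100_000
for _ in range(trials):
    if random.sample(deck, 2) == ["A", "A"]:
        hits += 1

print(f"exact = {exact:.5f}, simulated = {hits / trials:.5f}")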

Where I'm going with this
Gradebook revamp...again...slightly.  I've been wondering how I can give more feedback and better guide instruction.  After a few classes of these one-to-one (or one-to-three) conversations, I'd like to have an accurate reflection of where each student stands with regard to the skill set we're currently learning.  I see this as a continuously updated Google Doc to reflect upon conversations I've had, and students' demonstrations of mastery during class.

After I put together my assessment of a student's body of work, I want to share it with them, and place something in my gradebook after I see how it fits on the scale of 1 (minimal) to 5 (advanced).  I've decided that this has to be done not just after an assessment, but after a consideration of a student's entire body of work.  I don't want to give a test, and have that be the lone summary of what a kid can do.  That's the old way.  I want to consider an entire body of work (tests, quizzes, projects, formative assessment, etc) then place a numeric score in my gradebook.  Yes, the last part of that sentence makes me just sigh and agree that I'm going along with the establishment, at least just a little bit.

Resources
My Probability Skills List (SBG)
Rethinking Assessment (Spencer's Scratch Pad)
What is "Bad" Teaching (Steve J. Moore)
Slideshow of the Lesson Itself

Tuesday, January 25, 2011

"AP Kids" are not the only ones who need quality learning experiences

After hitting the Google Reader hard while at Subway the other day (probably should've been eating lunch and relaxing instead of working) I was inspired by @InnovativeEdu's post on asking kids to design their own learning. Having just switched to the project-based learning format for my classroom, I love to share what my students are doing at these professional development sessions we conduct.

The question that always comes up is "Yeah, but what level do you teach?"  When my response is AP Statistics, it's immediately dismissed since "they are AP kids."  So what if I said, "I had this great lesson where I stood and lectured for 46 minutes with zero audience participation!"  I can almost guarantee to have the same response, "Well, your lecture worked so well because they're AP kids, no way that would work with my 4.0's!"

As an educator in a professional development workshop, why not spend that time to think of ways to reach the kids that are not "AP kids".  It seems like that would be a better use of time than to confirm your suspicions that there just isn't anything that works to educate those that are not taking Advanced Placement courses.

The students in 4.0 classes are the ones that have been most vocal about not being interested in what you have to say.  They are students that are completely unwilling/unmotivated to work unless it interests them.  Know what is especially uninteresting...x's, y's, and slopes of lines.  But these lower level courses cover basic equation solving and "find the slope" the exact same way, over and over again, from the time the student is in 9th grade until 12th grade.

Great lessons, quality education, and interesting projects shouldn't be reserved for the best and brightest students.  The 4.0 students don't need more lectures and more basic junk that they don't care about.  They need to be interested, first and foremost.  They don't need more discipline or a rigid classroom structure.  They've told you 100 times that they hate that environment, so stop imposing it on them.

When I see my "AP kids" work on a project that they're excited about (sampling teachers in the school to see if they have tattoos, an experiment on whether or not people can walk and text, seeing how often radio stations repeat certain songs), they aren't excited about it because they're "AP kids".  They aren't excited about it because I threatened them with detention if they showed a lack of enthusiasm.  They're excited about it because they had the choice in what they wanted to do.  They're students; there is no way they are so drastically different from their peers who just so happened to not do well in one math class and were forced to slide down the ladder and get stuck in "4.0 world".

Oh, and they learn way more doing their own projects than they ever could from answering some multiple choice and free response questions for me.

Wednesday, January 12, 2011

Reading through state standards (PA), going "mental"

I am spending my snow day planning for a presentation on technology-integrated K-6 math instruction.  The focus of my session is on generating student inquiry and keeping technology completely transparent.  I am keeping this presentation aligned to state standards and vision, and doing so makes me very uncomfortable.  These standards/essential questions/competencies seem to all center around being able to generate a correct answer for a test.  This elementary math curriculum framework can be found at the Pennsylvania Standards Aligned System website.  Here are a few phrases I don't particularly care for:

Taken from the 2nd Grade Mathematics list of Big Ideas and Essential Questions:
1.  "How do we know when it is appropriate to estimate or when it is appropriate to use mental math for an exact answer?"
The person that included the phrase "mental math" is clueless in the area of mathematics.  Estimation and approximation are (in my view) much more "mental" than development of an "exact" answer, yet they are projected here as completely non-cerebral tasks.

2.  "Develop extended understanding of multiple models, and properties of addition and subtraction, leading to fluency with efficient, accurate and generalizable methods to add and subtract multi-digit whole numbers and develop quick recall of addition and related subtraction facts. Select and apply appropriate methods to estimate sums and differences or to calculate them mentally."
Again, here they reference calculation as being done "mentally" and estimation as something that's done as an alternative to thinking.  I also don't like the use of the words "efficient" and "quick recall" as they imply that the student that adds two numbers in 10 seconds is somehow better than a student that adds two numbers in 10 minutes.

As we move towards common core standards and the like, is this the language that is to be used?  If so, mathematics instruction will never be more than an instruction of process.  A focus on efficiency over a focus on an understanding of mathematics keeps us at this procedural level.  And by maintaining that estimation is done non-mentally, we're completely missing the boat.  Right answers are not the most important part of learning mathematics.  Isn't it time we start asking our students to experiment and create in their math classes, instead of simply generating the same right answer that 25 other classmates generated?