I think my students and I finally got some things right this semester. Too bad it only took twenty-five years, but I’ll take it anyway. Rather than wallow in self-pity, I’ll just get right to it.

My students and I have finally mainstreamed special relativity as the starting point in introductory calculus-based physics. There are no equations, only significant conceptual understanding that, when you get right down to it, is far simpler than almost all students expect it to be. Sometimes we get as far as the Lorentz transformation, but usually we don’t, and that’s fine. As long as students understand that all…ALL…of the perceived “weirdness” of special relativity comes from the realization that light’s speed is invariant, we’re good. This leads directly to both time dilation and length contraction. Understanding that nothing “really” happens to moving clocks or moving rods is far more valuable than manipulating countless equations.
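For readers who do want to see how time dilation falls out of the invariance of light’s speed, the standard light-clock argument can be written in a few lines (this is the textbook derivation, not necessarily how it’s presented in my classroom):

```latex
% Light-clock derivation of time dilation, assuming only that c is invariant.
% In the clock's rest frame, a pulse crosses a gap of width L in time
% \Delta t_0 = 2L/c. In a frame where the clock moves at speed v, the pulse
% traverses the hypotenuse of a right triangle instead:
\[
  \left(\frac{c\,\Delta t}{2}\right)^{2}
    = L^{2} + \left(\frac{v\,\Delta t}{2}\right)^{2}
  \quad\Longrightarrow\quad
  \Delta t = \frac{2L/c}{\sqrt{1 - v^{2}/c^{2}}} = \gamma\,\Delta t_{0}.
\]
```

Nothing happens to the clock itself; the longer interval $\Delta t$ is entirely a consequence of the pulse’s speed being $c$ in both frames.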

My students and I have deprecated pencil and paper in favor of LaTeX. We still use whiteboards, but for anything that is to be turned in, the expectation is now that it will be done using LaTeX. The Overleaf environment eliminates the need for a local TeX/LaTeX installation, works on every device I’ve tested it with, and lets students build a library of problems and solutions. Of course, they also use my mandi package, which was designed specifically with student use in mind. Note that I almost always use a version more recent than the one on CTAN. Anyway, at the end of the semester, students leave with a folder/portfolio of problems and solutions they’ve written. That’s something tangible I never really had as a student.
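For the curious, the skeleton of such a document is minimal; this sketch shows the general shape only (the specific mandi commands students use in class aren’t reproduced here, so consult the package documentation for the current command set):

```latex
% Minimal Overleaf-ready skeleton for a student problem write-up.
\documentclass{article}
\usepackage{mandi}  % physics notation package designed for student use
\begin{document}
\section*{Problem 1}
% Statement of the problem, followed by the worked solution.
\end{document}
```

Because each problem lives in its own small document like this, accumulating a semester’s portfolio is just a matter of keeping an Overleaf project per chapter.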

My students and I have deprecated traditional tests, quizzes, and such in favor of an approach I rather publicly stole from a Caltech course taught by Kip Thorne. The idea is to let students choose what to do in order to demonstrate learning and understanding. I mean, I’m continually told that “ALL physics students will ultimately be judged by their ability, or lack thereof, to work textbook problems” (note that I don’t necessarily agree with this, but it is indeed a strong status quo opinion), so why not just eliminate the traditional tests and get down to the nitty-gritty? I present a list of problems from each chapter, and not trivial problems either, from which students choose the ones they feel most accurately convey and demonstrate their achievement. I’m not entirely happy with this in that it implicitly assumes that the textbook problems are the ultimate endpoint, and they most certainly are not. Therefore, next time I will experiment further by directing students to my list of problems and questions on this blog that, hopefully, go deeper than many textbook problems.

My students and I have almost deprecated handheld calculators in favor of Python. My ultimate goal is to completely deprecate them and have ALL calculation done with Python/VPython scripts. That way, students can build a library of scripts for various purposes. However, writing scripts requires learning Python, and students are hesitant to dig in at first, frequently resorting to calculators. I must try harder in subsequent semesters to reinforce the utility of Python over and above the computational problems from the textbook.
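As a sketch of the kind of reusable script I have in mind, here is a short, hypothetical example (the function names and the choice of v = 0.6c are mine, not from any assigned problem) that computes the Lorentz factor and a dilated time interval, the sort of calculation students would otherwise punch into a calculator:

```python
import math

def gamma(beta):
    """Lorentz factor for a speed expressed as a fraction of c (beta = v/c)."""
    if not 0 <= beta < 1:
        raise ValueError("beta must satisfy 0 <= beta < 1")
    return 1.0 / math.sqrt(1.0 - beta ** 2)

def dilated_time(proper_time, beta):
    """Interval measured in a frame where the clock moves at speed beta*c."""
    return gamma(beta) * proper_time

if __name__ == "__main__":
    beta = 0.6  # hypothetical example: v = 0.6c
    print(gamma(beta))              # Lorentz factor, 1.25 for this speed
    print(dilated_time(1.0, beta))  # one proper second as seen from the frame
```

Once a student has written this, every later time-dilation problem reduces to changing one number, which is exactly the library-building habit the approach is meant to encourage.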

In my opinion, I still lack the ability to sufficiently motivate students to engage in their education outside of the classroom environment I strive to create for them. However, just this morning I read the blog entry in this tweet and began reflecting on my own practices. Maybe I should loosen up even more (I run a comparatively relaxed classroom as it is) and just let students be the people they are and see what happens. It’s something I need to think about.

I also want to experiment with having students present problems and solutions to the class as part of their assessments. This seems like an efficient way to incorporate oral examination into the course without it becoming a logistical nightmare.

As always, feedback is welcomed.