*Who's ready to drink the Kool-Aid with me?*
Having said all that, I want to point out something that should be really obvious - there's absolutely no reason an auto mechanic must learn the Nevada Constitution.
*It's at times like these that I remember that Nevada's Constitution requires the state to foster institutions for the insane, blind, deaf and dumb.*
*Ears and, well, shucks, maybe a kernel of corn or two as well.*
It's all about... speed.
Remember Summer Break? And Winter Break? And Spring Break? Remember how, when you were in college or high school, you spent about half the year actually in class, while the other half you partied with your friends, or played video games, or read Emily Dickinson poetry or something? Well, for-profit schools have breaks, too, only instead of 180 days' worth of them each year, we take more like 28.
Okay, I exaggerate slightly. We also get snow days, at least where I live, plus probably another week or so if you add up the days we get off for three-day weekends like Labor Day, Memorial Day, and so on, but the point is, once you're in school, you don't leave until you're done. Consequently, the longest program where I work is an 18-month program which, at its conclusion, gives you an associate's degree. Remember, at a public community college, it would normally take you at least two years to get one. That's six extra months of earning a higher wage after graduation right there.
In order to make that possible, though, there's a catch...
Academic independence? What's that?
Instructors don't write their course curriculum where I work. They don't write their homework assignments. They don't write their syllabuses. They don't pick their textbooks. Instead, this is all the responsibility of their department chairs, who, in turn, are accountable to the Academic Dean and must submit any and all changes in their classes to the Dean before they're allowed to implement them in class.
Any professors reading this post right now just felt the hair on the back of their necks stand on end and started muttering profanities in Coptic Sumerian.
Thing is, this is an absolute necessity for two reasons - first, we don't employ professors. Our instructors are people that used to work in the field. They know how to do one thing - whatever job it is they're training students how to do. What they don't know, at least not right off the bat, is how to write homework that makes sense, how to write fair tests, how to set grading criteria, how to write up a syllabus, how to pick a textbook, or any of that. Consequently, all of the academic paperwork has to be taken off their hands so they can do what they do best, which is show someone else how to do what they know how to do. Another advantage of this approach is that, if an instructor decides they've had enough and it's time to go back to the field - and that does happen - another qualified instructor can pick up right where the syllabus left off without drastically changing what's taught in the class.
It's not all bad for instructors, though...
Adjuncts? What are those?
You know how, in most universities these days, graduate students and adjunct professors teach undergraduate classes in exchange for just enough to buy some ramen and boil some water?
We don't do that.
Oh, don't get me wrong, we have part-time instructors too, but, for the most part, they're part-time because they already have full-time jobs in the field and don't want to leave. It's not that we're not willing to hire them full-time - in fact, given our rather aggressive schedule, we usually prefer instructors we can hire full-time because they're much more flexible. However, there are certain topics we teach that make far more money in the field than any college can afford to pay without effectively doubling or tripling tuition. Since it seems a little unfair to charge students who make less than $15/hour enough in tuition to pay an instructor the low- to mid-six figures they're capable of earning for the same skill set in the private sector, we take what we can get when and where we can get it.
Oh, and graduate students? Students don't teach students at our school. Period, end of discussion. For what we charge in tuition, students deserve to be taught by people that actually know what they're talking about.
Yes, flexibility. A lot of our students have day jobs. What does that mean for us? It means we teach at night. Until 10:30. Some students, however, work in the afternoons, so we also teach in the mornings, starting at 8:00. Some instructors end up working split shifts, teaching a class for a few hours in the early morning, then heading home for a nap and coming back in the evening to teach the same class to a different group of students. We try to avoid that, but it happens from time to time. We're even looking into weekend courses.
Student schedules must be a mess, then.
Quite the contrary. Students pick a block upon admission - morning, afternoon, or evening. Each block is filled with classes and takes about 4.5 hours to complete from start to finish. Assuming they pass their classes and don't take a leave of absence or request a block change, they'll stay in that block from start to finish and have their schedule automatically chosen for them all the way through. No fuss, no drama, no "I hope I can get into that class," none of that.
If they fail a class and have to retake it, things start to get a little complicated, but only a little - it happens often enough that we're prepared for it. At worst, they'll come back in a phase or two, when they can hop back onto their original block schedule.
Oh, right - you know those "semesters", "quarters", or whatever that traditional schools have? Yeah, they don't exist where I work. We have six week phases. You start class, you do your homework, you sit in lectures, you participate in some labs, and six weeks later you take your final. Then, the following week, you start another phase with another set of classes.
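For the programmers in the audience, the whole scheduling model is simple enough to sketch in a few lines of Python. Every block time, class name, and phase below is made up for illustration; the point is just that a student's schedule reduces to a (block, phase number) pair, with no class-by-class registration at all:

```python
# Each block is a fixed daily time slot; the program defines the sequence
# of classes per six-week phase, so scheduling is a simple lookup.
BLOCKS = {
    "morning": "08:00-12:30",
    "afternoon": "13:00-17:30",
    "evening": "18:00-22:30",
}

# Hypothetical program: each entry is one six-week phase of classes.
PROGRAM_PHASES = [
    ["Intro to Computers", "Trade Math"],
    ["Networking I", "Customer Service"],
    # ... remaining phases elided ...
]

def schedule_for(block: str, phase_index: int):
    """Return the (time slot, class list) a student in `block` attends."""
    return BLOCKS[block], PROGRAM_PHASES[phase_index]
```

A student who fails a class just re-enters the same lookup a phase or two later with the same block and an earlier phase index, which is why retakes stay simple.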
No rest for the weary.
Didn't you say you were going to talk about what it's like to work there?
Indeed I did, and this is as good a time as any to bring up a rather interesting point - because of the constant, grueling schedule, working as an IT Manager here is more like working in IT at a casino or a retail establishment than a school. Maintenance is done in very tight windows that assiduously avoid downtime for students and instructors. Scripting is used religiously - if something can be done repeatedly, it can be done automatically at 0300 on a Sunday. Network upgrades are done very carefully, usually during one of the four weeks a year that we're not holding classes. Even then, though, the building is open and staff is working.
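As a sketch of what that looks like in practice (the window times and the task here are hypothetical, not our actual schedule), a scheduled job can simply refuse to run outside its maintenance window, so even a misfired cron entry can't take anything down during class hours:

```python
from datetime import datetime

# Hypothetical weekly maintenance window: Sundays, 03:00-05:00 local time.
MAINTENANCE_DAY = 6   # datetime.weekday(): Monday=0 ... Sunday=6
WINDOW_START_HOUR = 3
WINDOW_END_HOUR = 5

def in_maintenance_window(now: datetime) -> bool:
    """Return True if `now` falls inside the weekly maintenance window."""
    return (now.weekday() == MAINTENANCE_DAY
            and WINDOW_START_HOUR <= now.hour < WINDOW_END_HOUR)

def run_maintenance(task, now=None):
    """Run `task` only inside the window; bail out harmlessly otherwise."""
    now = now or datetime.now()
    if not in_maintenance_window(now):
        return "skipped: outside maintenance window"
    return task()
```

The scheduler (cron, Task Scheduler, whatever) fires the script; the script itself is the last line of defense against running at the wrong time.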
Incidentally, as awkward as this is for IT, remember that instructors and department heads are operating under the same constraints. There's no time to completely overhaul a class between one phase and the next, so all class changes and improvements must be made iteratively. If you'd like, you can think of each phase as a six-week sprint.
Back up a second. If the people that are teaching the students aren't in charge of what's in the curriculum, how do you know the students are learning the right material?
Oh, that's easy - we ask employers. In fact, even department chairs don't get to make arbitrary changes to the curriculum without asking a committee of employers about the possible change or without soliciting input from employers about other changes they'd rather see instead.
A good example of the difference between our approach and a university's is the use of Linux in our curriculum. At UNR, Linux is all over the place, especially in the Computer Science program, which makes sense - it costs nothing to install and the code is open source, which makes it really easy to show future programmers what's going on, where, and why. Where I work, however, our curriculum is determined by our students' future employers, and, at least in Reno, employers think it would be "nice" if our students knew more about Linux; on the other hand, they think it's "imperative" that students know something about Cisco and Windows. Consequently, when we teach Linux, we have to be kind of sneaky about it - if it doesn't fit in a class we already teach that has already been approved by the employers, we can't use it.
It's not that we don't want to teach Linux - it's just that the people that hire our students would rather they learn just about anything else instead first. Believe me, I asked.
So, how do you pick which programs to teach?
This is where I'll pause and point out I'm an IT Manager, not an Academic Dean, so my understanding of the details is square in the middle of what I like to affectionately think of as the "Dunning-Kruger Zone". However, from what I've been able to piece together, it looks something like this:
- What skills are employers looking for that they're not getting enough of from the community right now?
- Can we find people in that field that are willing to teach these skills?
- Are students willing to learn those skills?
The last point is actually a pretty big issue. Our best-paying and highest-placing programs after graduation are also frequently our least popular. Why? Because a lot of students think those programs look too hard for them. Maybe the program takes too long, maybe the program promises to teach a set of skills that the prospective student can't even conceptualize. Which actually brings something else up...
We have to start from scratch.
*So, what, like "not fast food?"*
*Oh, okay! Scratch!*
Still not scratchy enough.
A couple of months back, I had to take over an Introduction to Computers course at work. The class itself is pretty straightforward - the first half of it focuses on typing exercises, with the latter half focusing on using Microsoft Word to write a basic letter. Based on my previous IT consulting experience, I knew there was a better-than-even chance I might go too fast, so I encouraged the students to raise their hands and stop me if that happened. It's not because the class was stupid or I was so smart - it's just that, well, my father's a programmer who got the family a computer when I was three, which was over thirty years ago. Long story short, I've been eating, drinking, breathing, and sleeping computers almost since birth, while being partially raised by someone who's been making a living doing the same for longer than I've been alive. When you're exposed to something that long and that early, there are a lot of assumptions you take for granted, like "double-clicking on something opens it", "Delete and Backspace do different things", "the left mouse button does something different from the right", and "you really don't need a mouse to use a computer", among other things. Trouble is - and I knew this from working in the field - most people haven't spent literally decades internalizing these lessons.
Don't get what I mean? Perhaps this will help:
Bear in mind that the computer they were struggling with is not only the same computer I had at home (more or less - ours was an Apple ][e, not the Apple ][+ shown in the video), it was also the same computer that I and my classmates saw in every single computer lab from 1985 until the early '90s. Seriously, those things were the Volkswagen Beetles of the early educational computing world - it's hard to overstate how ubiquitous they were in schools. If you put one in front of me right now and told me to use it, I'd be halfway to finishing this blog post on it in no time. The kids in the video, however, never had to deal with a computer like that, so all the things I took for granted growing up - flipping a disk over, inserting a disk in general, typing simple commands at a prompt to make something happen (yes, I remember PR#6) - are completely lost on them. They never had to deal with it.
Okay, maybe that's not a fair example. How about a Sony Walkman? That's easy enough, right?
With a little bit of trial and error, they eventually figure it out... sort of. Again, though, when you were raised around these things and everyone had one, it's easy to forget that there's really nothing particularly intuitive or obvious about a cassette tape player if you've never held a cassette in your hand in your entire life.
Now, pretend those weren't naturally inquisitive kids figuring out a Walkman but were instead adults - many of them somewhat older - trying to figure out a computer for the first time.
Wait, what's that? The first time?
How is it possible for any American to use a computer for the first time in the 21st century, you ask? Does Northern Nevada have a surprisingly large Amish population or something? No, and that's not the crazy part. Most people in the class I was teaching that hadn't meaningfully used a computer before were younger than me. A lot of them were students in their early 20's. Maybe a little younger. The older students had at least used a computer or two at work at some point in their lives.
But, what about Computer Literacy courses? Computer labs in elementary schools? Did their parents sign permission slips exempting them along with the sex ed waivers or something?
No, but here's the thing - and, if you're the type of person to read blogs, this is going to be near-impossible to believe - a lot of people, and I mean a lot of people, don't actually have computers at home. Even now. Or, if they do have a computer at home, they rarely power it on and never, ever let their children touch it. After all, if you're a financially struggling parent with minimal computer literacy, spending a few hundred dollars on anything is a stretch, much less something you barely understand how to use. Add in a few moral panics about social media and viruses and you'll be firmly convinced that, if your child touches your computer, they'll break your expensive computer on the way out the door and get kidnapped by a sexual predator they met online on the way to school. Naturally, after being told by their parents that "computers are forbidden" over and over again, when children actually get a chance to touch one at school, they'll either be too afraid to do anything or, just as likely, they'll just ignore it at school just like they learned to at home.
So, what we have to do where I work is not only teach people who have never had a real chance to use a computer before how to use one - all jobs these days require them in some capacity or another - we also have to overcome that initial fear of computers that has been purposefully instilled in them their entire lives. Adding insult to injury, we have exactly six weeks to somehow accomplish this educational and psychological breakthrough; if we don't succeed, we have to fail them and they'll have to retake the course, only next time they'll be carrying the added internal shame of failing one of their first courses. Not surprisingly, that really doesn't help.
This, coincidentally, comes back to why some of our best-paying programs are also our least popular - the ones that pay the most use computers extensively, either to keep networks running, or manipulate robots, or something else entirely. The most popular programs are the ones that use computers in their curriculum as little as humanly possible.
This looks like as good a stopping point as any - next time, I'll discuss what happens to the students when they're done and some of the things we have to be careful about along the way, along with a little more detail about the experience of actually working at a for-profit college.
"20100609 - Unstirred Kool-Aid" by Rob is licensed under CC BY-SA 2.0
"Mechanic" by : : w i n t e r t w i n e d : : is licensed under CC BY 2.0
"Corn is in" by Dwight Sipler is licensed under CC BY 2.0
"Hamburger Helper, 1994" by Roadsidepictures is licensed under CC BY-NC-ND 2.0
"Fresh produce at the Byward Market" by Jamie McCaffrey is licensed under CC BY 2.0
"a pretty morning farm" by scott1346 is licensed under CC BY 2.0
"Sunrise Desért Maroc" by Grand Parc is licensed under CC BY 2.0