Friday, May 4, 2012

Capstone classes and "real" engineering

I'll attempt a bit of subterfuge and obfuscation here so I don't inadvertently ruffle too many feathers...

A friend just sent me a summary of the EE capstone projects from this year's crop of students at a university that shall go unnamed. Bright-eyed young engineers, fun problems, innovative solutions, yadda-yadda-yadda. Pretty much anyone reading this has a pretty good mental image of what might have been in the e-mail, and many of you LIVED it. But then, right at the end, was an incredibly troubling BTW:

PS: [professor's name here] told me that he does not allow Arduinos to be used on these projects, since it makes things too easy (at this level of education, they're supposed to be learning the stuff that Arduino hides from you).

Better he was there than me. I'd have lost my shit if an engineering prof said something like that to me.

I get what he's saying- that the journey is the point, not the destination- it's just that, for everything else these students will ever do, that idea is dead wrong. And I'll agree with one thing: for a microcontroller class, Arduino is the wrong choice because it does abstract away important concepts. But this isn't a microcontroller course, it's the capstone course, which is as close to an actual "design-a-solution-to-this-problem" assignment as most of these kids will get before they find their way into a workplace.

Giving these kids the (mistaken) idea that solution A (which takes five hours to realize because you used an Arduino) is somehow inferior to solution B (which takes three months to realize because you have to select a microcontroller; design, fab, populate, and respin a PCB; write code mostly from scratch; and do all of this without the tremendous body of prior work and community support that applies directly to the code and hardware you're working on) is only going to cause trouble down the road- trouble for them, for the more experienced engineers they're hired to work alongside, and for clients whose dates get missed because using solution A feels like "cheating" or "not real engineering".

I was taught that the point of an engineering education is to get a person to learn how to solve problems, not how to use tools (cross reference Isaac Asimov's wonderful short story "Profession"). The point of a capstone class, IMO, is to take the gloves off, put the students in the ring and expect them to beat the problem into submission through any means they see fit.

This idea, that the "easy" solution is wrong, undermines a core tenet of engineering- the "right" way to do something is any way that comes in fully functional, on time, under budget, and set up to perpetuate those three things into the future (i.e., is maintainable). I've known plenty of engineers so caught up in doing things the "elegant" way that they insisted on writing their own floating point library (in assembly!) for their microcontroller of choice, even though a perfectly good library already existed.

Done is beautiful.

Saturday, January 21, 2012

Numeracy fail

Side note- it's been a looong time since I wrote a blog post. Since then, I've had a second child, moved jobs (I'm now working at SparkFun- woot!), states (Colorado is state number six for me), and time zones. A lot of the more technical sort of blog post I'd write now goes up on the SparkFun website as tutorials, but I'm going to try to post more here.

Engineers and scientists (and other general nerd/geek types) like to talk about "numeracy", the ability of a person to grok math. It used to be that the primary "complaint" (if you will) was about people who play the lottery- "a tax on being bad at math". To be clear, I'm not talking about valuation here- being willing to pay five- to tenfold as much for a meal at a restaurant as it would cost to make at home, for instance. I'm talking about hard numbers- apples to apples.

Having a poor grasp of mathematics is becoming a bigger handicap, though. I've noticed recently when grocery shopping that the old principle of "buy more, save more" no longer applies- frequently, a bigger box of cereal, say, will cost MORE per ounce than a smaller box.

Or, containers will be cleverly redesigned to contain less while looking the same, and the price is maintained. Next time you're at the store, check out the "half gallon" containers of orange juice. Many (most?) of them are 59 ounces, not 64. They don't look any different, of course, and there's no mention anywhere that 59 ounces is NOT a half gallon (and I'm not optimistic that many consumers know what a half gallon is, nor that they could calculate price per ounce between a 59 ounce jug and a 64 ounce jug).
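The unit-price math here is just division, but it's worth seeing how much the shrink hides. A quick sketch (the $2.99 shelf price is a made-up example for illustration, not a real figure):

```python
# Hypothetical example: two juice jugs at the same shelf price,
# one quietly downsized from a true half gallon (64 oz) to 59 oz.
def price_per_ounce(price_dollars, ounces):
    """Unit price in dollars per ounce."""
    return price_dollars / ounces

full_half_gallon = price_per_ounce(2.99, 64)  # the old 64 oz jug
downsized_jug = price_per_ounce(2.99, 59)     # the new 59 oz jug

# Fractional increase in unit price from the downsizing
markup = downsized_jug / full_half_gallon - 1

print(f"64 oz jug: ${full_half_gallon:.4f}/oz")
print(f"59 oz jug: ${downsized_jug:.4f}/oz")
print(f"Hidden price increase: {markup:.1%}")
```

Whatever the sticker price, shaving 5 ounces off a "half gallon" at the same price is about an 8.5% increase per ounce (64/59 - 1), which is exactly the kind of discrepancy that won't paper over in mental math.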

I noticed this BECAUSE I grok math. I do simple mental arithmetic many, many times a day as an electrical engineer- calculating expected currents, expected power consumption, approximate required resistance, etc etc. When I look at a package in the store, I can't help but calculate a price-per-ounce of the contents. It's usually right to within 5 percent or so- I round to make the math easier (metric would make it easier yet- is that why we haven't changed?). But I was noticing that there were discrepancies that could not be papered over by hurried mental math.

Keep an eye open- I've heard from a few others that they've noticed this, but I'm curious how widespread the practice is.

One other place I've noticed crap numeracy is in science writing, and this is REALLY disturbing to me. Two variations on the same metric will be given in a book or article- say, the number of children who die of malaria every day and the number of people who die of malaria in a year. These values may be reported multiple times and usually won't appear in close conjunction with one another (note that I don't think this is an attempt at dissembling by the author- it simply reflects where and when in the work each figure makes sense to mention). The troubling thing is, the numbers reported are frequently mutually exclusive. For the above example, the number of deaths per year due to malaria may be reported as, say, 500,000. Quick mental math says that this implies somewhere over 1,000 people per day (remember, I'm an engineer- pi is "about 3" until I need a better answer). Elsewhere, a figure for child deaths will be given, and that number will be, say, 1,500 per day.

In my mind, alarm bells go off. I didn't calculate exactly how many people per day are claimed to be dying of malaria, nor exactly how many children per year, BUT my order-of-magnitude estimate tells me that the author is either not counting children as people or not paying attention to the math. Frighteningly enough, neither the editor nor anyone else who weighed in on the book caught it. This kind of basic mathematical error casts doubt on everything else in the book- after all, someone who can't make that kind of mental comparison is unlikely to be able to judge the veracity of the more complicated mathematics behind the statistical predictions and observations that form the basis of most proposed solutions.
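The sanity check above can be written out explicitly, using the hypothetical figures from the example (500,000 total deaths per year, 1,500 child deaths per day- illustrative numbers, not real statistics):

```python
# Hypothetical figures from a book: 500,000 malaria deaths per
# year in one chapter, 1,500 child deaths per day in another.
total_deaths_per_year = 500_000
child_deaths_per_day = 1_500

# Convert the yearly total to a daily rate for comparison
total_deaths_per_day = total_deaths_per_year / 365

print(f"All deaths, per day:   {total_deaths_per_day:,.0f}")
print(f"Child deaths, per day: {child_deaths_per_day:,}")

# Children are a subset of people, so this must hold for the
# two figures to be mutually consistent:
consistent = child_deaths_per_day <= total_deaths_per_day
print("Figures consistent?", consistent)
```

With these numbers, the yearly total works out to roughly 1,370 deaths per day for everyone, yet 1,500 children per day are claimed- the subset exceeds the whole, so at least one figure has to be wrong.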

I'm not sure there's much we can do about this- I grok math because it's very much part of my daily life. I exercise those mental muscles constantly to a point where I apply them unconsciously to situations most people don't even relate to math. Maybe some kind of wide-scale gamification of mathematics? At any rate, I don't think it's something that can be addressed by education. Mathematics is fundamentally a foreign language to the human brain, and the only way to really learn a foreign language is to use it, over and over, until you are fluent.