About Learning Programming
There are often questions in this forum about what language someone should learn, or how someone should approach studying programming. Every once in a while, someone will even ask about how to approach Computer Science, specifically.
I've been thinking about these questions a little. The issue has been attractive to me for two different reasons, I think. First, there's a shortage of good engineers. Really: the number of engineering graduates is declining, and it wasn't that high to begin with. This puts industry in a bind, because industry and commerce need software. But they can't get that software without engineers. I'd like to see this situation get better because better, easily obtainable technology means a better future for everyone.
Next, I occasionally talk at career days at schools. (I do this far less often than I wish to, in fact.) I always think about what I can tell young students who might consider an engineering career, or work in the field of mathematics or computer science specifically.
I thought I would write an essay and post it here to share my thoughts. If you'd rather read something else, feel free; you shouldn't have a hard time finding something interesting on the Internet, perhaps without even leaving HardForum. I'm just hoping to help those I see so frequently here, asking for guidance.
The Big Disclaimer
Before I start, however, there are a couple of things you might want to know about me. Most importantly, I'm a college dropout and I have a strong disdain for the educational system in America. If students are paying $0.30 each for ScanTron test forms, but the football team flies to Denver for free, well, you've got problems. And the US education system, at all levels, has some serious problems.
Next, I've always known what I wanted to do with my life. I think this puts me in the minority; most of my acquaintances in high school didn't have a clue what they wanted to do when they graduated. They found a major very late, by my standards, and had little background in the field of their life's work before they started studying it. My decision was easy, and I believe that not everyone has it so easy.
Finally, by no means is this essay intended to be anything like an answer, or give prescriptive advice. Personal situations are too varied, resources aren't ubiquitously available, aptitude and desire are different, and so on. One of the subtle reasons that I'm writing this post is that one of my previous notes was called "my advice" and "my recommendation" when I had gone out of my way to explain that it really wasn't -- it's just what I did myself.
What I Did Myself
The way I learned to design and write software might not work for everyone. I've listed some of the reasons it might not work above, but by far the most important reason is that everyone is most comfortable, and therefore most effective, doing their learning in a different way. (This is one of the biggest reasons I've got a problem with formal education systems: they expect a largely one-size-fits-all approach will work. It might be the best we can do right now, but such a system doesn't work so well. At best, it leaves a substantial number of individuals poorly served.)
Experience shows us that people learn best in different ways. Look at your friends: I'm sure you know someone who studies by sitting in front of the television with a book open, cramming at the last minute. And someone else who sits in a quiet and dark room with a bright desk lamp, exactly four sharp pencils, one highlighter, two new pads, and their textbook.
Book recommendations always bring a variety of answers. Why? Because people need different authors to approach material in different ways. I sweated bullets when writing my book about getting the style and scoping right. Even after all those revisions, I'm still not sure it's right. Readers would send me emails saying that they loved my book. Others would write and say it sucked, and that I should write more like so-and-so, instead. As an experiment once, I forwarded such an email to someone who had liked my book; I asked them if they liked that other guy's book, and why. They quickly responded that all the reasons the other reader hated my book and loved the other one were reasons they didn't like so-and-so's book, but loved mine.
The educational process, in my naive and untrained opinion, seems very much hit-or-miss. There must be fundamental things that always work, but teaching adults (or near-adults) complex things doesn't seem to benefit from those fundamental techniques. Otherwise, everyone would get it right and all schools would be equal, wouldn't they?
I learned programming by first studying digital electronics. I used a breadboard and TTL logic chips along with a training manual from Hewlett Packard. Eventually, I found the BugBook series which really got me going. It included little modules: a bank of switches, four LEDs with transistors to drive them, a seven-segment display driven by a 7447 decoder, and so on. It was powered by a six-volt lantern battery and a voltage-dropping diode, and later by a 7805-based power supply that my neighbor built.
I was eight before I got my first computer, a single-board machine with a 6502 processor and one kilobyte of memory. It had a great six-digit, seven-segment display that I learned to program. Best of all, it had about 40 lines of digital I/O pins which I could use to control my little digital circuits. There were four timers, if I remember right, though a couple were used by the system itself.
This computer was programmed in machine language; it didn't even have an assembler. I used it for weeks, studying the sample programs and reading programs in magazines. One day, a few paragraphs in a programming book put it all together for me. Maybe the book was by Randal Hyde, or Gary Gygax; I can't remember. But the explanation was very clear and it made me understand what the magical board was really doing in a way that related to all the digital electronics I had been studying.
And that was it. I was off reading everything I could find to teach myself. I moved to a more capable little computer, then up to an Apple II. I was doing digital electronics projects at a furious pace, buying all the hardware I could afford and making little doodads which invariably ended up interfaced in one way or another to whatever computer I had at my disposal. By this time, I was eleven or twelve or so.
By the time I was sixteen, I had mastered the 8086, Z-80, and 6502 assemblers I was using. I had learned Pascal and C, and written countless little projects. Looking back at it, I really didn't learn a lot--I lacked any guidance whatsoever from someone who could have mentored me to learn more or study interesting things that were "just challenging enough". Anything too frustrating would've quashed my interest. Too easy? I would've similarly moved on to something that was more engaging.
When I tried college, I found that the people I was paying to be my mentors weren't really doing such a hot job, either.
What I Learned
If you had asked me what I had learned at that young age, I would've rattled off a list of languages and maybe a few techniques, not unlike the one above. It's taken another decade or two to realize what I truly had learned, not just what I thought I knew.
My perception of computers starts out at the lowest levels. I'm not necessarily comfortable with something unless I am satisfied that I understand how it works. A truly great racing driver isn't going to get very far, no matter how talented he is, unless he understands the car at least a little. He doesn't need to know how to fabricate his own suspension, or do a valve job on an engine. But he should certainly know what's involved, and what the components do. Otherwise, how can he understand how his inputs to the car will affect it in different situations? By experience? Sure -- given a very large number of laps, he can probably encounter almost every scenario. But that's not realistic, and it won't be complete. His gut instincts might be wrong if they're not rooted in an education about the car and its workings.
If you frequent the storage section of the forum, you might remember my badgering questions about RAID. How did it really work? How could I ever be comfortable writing software for it if I didn't have the foggiest idea of what made it really work? Why is some code slow and some code fast? If I don't understand the processor and the architecture surrounding it, I'll have a hard time optimizing for that same architecture, won't I?
Succinctly, what I learned was that a bottom-up approach rewarded me. I could learn something, then make it intuitive. Then, build on it. Even years after learning a lesson, I might still recall the facts about it and apply them to a new problem.
Do I know everything at the lowest level? Of course not. But what I do know makes it easier to learn more as I go along.
Since everyone is different, I'm sure there are people who'd rather drill down. (And, maybe, in some situations, I'm one of them. I'd rather cook a burger and eat it than learn enough biochemistry to understand why adding peppers or mushrooms makes it taste so much better.) And people in-between, who'd rather study and get a grade and get out of there. Maybe some of my observations, presented in the balance of this essay, will be of interest to those types of learners.
If you're in one of those other categories, maybe you can still learn from the ground up by reading books like Write Great Code: Understanding the Machine, or Code.
Debugging
I've become convinced that one of the most important skills in professional software development is debugging. The series of articles I wrote for the now defunct Visual C++ User's Journal underscores that belief, I think.
When interviewing candidates, I try to ask them at least one debugging question to gauge their ability to think through a problem. It is too easy to bone up on trick questions or memorize a few algorithms. What you do when things go wrong, in whatever discipline you're practicing, shows your true mettle. Do you ask for help? When? Can you at least measure a few things and look for clues? What if you find good clues? Bad ones? Conflicting ones? What if the terrain doesn't match the map?
A great candidate will admit to me that they debug code that already works. To the inexperienced, they'll sound like someone who doesn't know that you shouldn't try to fix something that ain't broke. But how can you fix something that is broken if you don't know what it looks like when it is working?
If you're working on a very low-level computer system, your debugging options aren't always wonderful. You might have tools with some rough edges, or some very critical timing issues. You'll need to debug more with your brain than your keyboard. A good understanding of the way things work is going to be indispensable in those conditions. Your noodle will be your biggest weapon.
I think that debugging extends past stepping through code, particularly when considering the study of code that already works. It means watching that code and seeing what it does. Why might it hit the disk so much? Does it cause page faults? Where? Can they be avoided? Debugging also involves doing code reviews, thinking of symptoms and causes, and reassuring customers.
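As a minimal sketch of what "debugging code that already works" can look like, here's a hypothetical Python example (the function and names are invented for illustration): a small wrapper that counts calls and accumulates time around a perfectly healthy function, so you can see what the working code actually does before anything breaks.

```python
import time
from functools import wraps

def observe(func):
    """Wrap a working function so we can watch what it really does."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        wrapper.total_time += time.perf_counter() - start
        wrapper.calls += 1
        return result
    wrapper.calls = 0
    wrapper.total_time = 0.0
    return wrapper

@observe
def lookup(table, key):
    # A perfectly healthy function -- but how often is it called,
    # and how much does it really cost?
    return table.get(key)

table = {n: n * n for n in range(1000)}
for n in range(1000):
    lookup(table, n)
print(lookup.calls)        # total number of calls the "working" code made
print(lookup.total_time)   # total seconds spent inside it
```

When something eventually does go wrong, numbers like these tell you what "normal" looked like.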
About Languages
One of the more common questions here is about which language to learn; even which implementation of which language.
If I interviewed someone who told me they knew language Z, but didn't know language R too well, I wouldn't mind. We could have the interview in language Z. But I'd press them a little bit about what they thought of language R. They should know at least a little about it; what it is good for, what it's not good for, and so on. If they start ranting about how it was invented by an evil scientist who worked at a company that everyone hates, or that it "just sucks", or other such nonsense, I can't imagine the interview would last very long.
A professional developer isn't concerned with language; they're even less concerned with style in a particular language. They're able to pick up a language pretty rapidly, since it's just a different way to express the ideas they should already be very familiar with. As they gain practice, they'll get better and more fluent. But if they spend the first couple of weeks learning different things, that's no big deal.
Your first language, however, is a more interesting question. I stumbled into machine language simply because nothing else was reasonably available at the time. Young programmers these days don't have it so easy.
If I were starting, I'd think about a few attributes for the language I chose to learn. First, would it be something that's commercially acceptable? I don't mean the language that offers the most jobs in the paper; I mean something that has community support, that I can get help with and find books for, and so on. I don't see any reason to start programming with Haskell, or some other lab-rat of a language. C++, Java, Perl, Python, some assembler, whatever; they all are documented and viable.
Next, I'd consider the frustration factor. How much background do I need to learn about the tools in order to get started? Having a mentor at your disposal, or access to some books or courses, can significantly change the way you answer this question. If your buddy can help you download and set up the MinGW kit or Visual C++, maybe you should do that. On the other hand, if you're doing it yourself and don't have much background, I can't imagine anything more frustrating than picking through gigabytes of zipped, tarred archives. How would you even know you got it to work?
One of the languages that I think does really well in this area is Perl. It's interpreted, so you don't have to sit through long, multi-step compiles. You can get immediate results without going through the edit-compile-test loop. Setting up the language isn't hard at all, and there are packaged installers that do the hard work. There's no shortage of books or websites. In fact, there's a book or two that teach algorithms with Perl, so you can study performance and tuning with the language, too.
Python is in the same boat, but it seems support for it is just a little less mature than for Perl. There might be other languages in this camp, but those are the ones I know. If you don't have the gumption to start out with assembler, then perhaps one of them is right for you.
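To make the "immediate results" point concrete, here's the kind of one-minute experiment an interpreted language invites. This sketch happens to be Python, since it sits in the same camp as Perl; type a few lines, run them, and see an answer with no compile or link step.

```python
# A quick question a beginner might ask: how long is each word
# in a sentence, and which word is longest?
words = "the quick brown fox jumps over the lazy dog".split()

# Answer it in one line and immediately check the result.
lengths = {w: len(w) for w in words}
print(lengths["quick"])   # 5

# Sort by length, longest first, and look at the output right away.
print(sorted(words, key=len, reverse=True)[0])   # quick
```

That tight feedback loop is exactly what keeps a newcomer experimenting instead of waiting.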
Be wary of people who say that something's "too ugly", or "really stupid". Maybe they're right, but that doesn't concern you right now. Learning does. Make a note of what they said and gain some experience -- then, see if you can figure out why they held that opinion. If you have a little experience under your belt, you might try asking them about their opinion. See how many statements they make which are unsubstantiated, and how many seem believable to you. (There's no shortage of religious wars about platforms, tools, and languages in this business. Since we don't certify engineers, there's no shortage of true experts.) Did listening to their beliefs teach you anything you felt like you could take away as fact?
Making Progress
Computer programming isn't easy. Many smart people can't do it, or aren't interested in trying. Some smart people learn enough to do what they need. Maybe they're using Fortran, or the Mathematica language.
Programming professionally -- making software that the general public uses, in whatever form -- is actually very hard.
As hard as it might or might not be, not doing it guarantees failure. And not continuing guarantees decay.
I learn Perl about twice a year. I just don't use it enough to remember what I've learned. I've got no problem starting again, and I remember more each time. But I still can't call myself well-versed in the language because I'm always rusty, and because I haven't solved any really big problems using the language.
Imperative to a successful start, then, is a project that's just the right size. Maybe your mentor can help you pick one depending on your abilities and the time and tools you have. Maybe it ranges from finding some small prime numbers to re-writing the Windows CALC program. It's going to be simple, though; if it isn't, you'll get hopelessly lost and frustrated. Your first project is certainly not writing a CAD program, or a chess game, or an MMORPG, or anything that's going to set the industry on fire. So set your expectations realistically. You're not going to buy some paints at the art store and churn out The Mona Lisa on your second weekend.
What is important for your first project is that it interests you. If you like model rockets, make a program that estimates fuel capacity or predicts where a rocket will land given some parameters about its launch. If you like music, pick something simple: read all the tags in your MP3 files, sort them, then print them out as text. Think of things that you'd never do by hand, but could do with a computer. Rip through every file on your hard drive, for example, and count them -- compute the average length, find the oldest and newest file, and so on.
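The file-counting idea above is about the right size for a first project. As a hedged sketch of what a finished version might look like -- in Python, with all the names my own invention -- it only takes the standard library:

```python
import os
import sys

def survey(root):
    """Walk a directory tree and gather the simple statistics a first
    project might report: file count, average size in bytes, and the
    paths of the oldest and newest files (by modification time)."""
    count = 0
    total_size = 0
    oldest = newest = None   # (path, mtime) pairs
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                info = os.stat(path)
            except OSError:
                continue   # file vanished or is unreadable; skip it
            count += 1
            total_size += info.st_size
            if oldest is None or info.st_mtime < oldest[1]:
                oldest = (path, info.st_mtime)
            if newest is None or info.st_mtime > newest[1]:
                newest = (path, info.st_mtime)
    average = total_size / count if count else 0
    return count, average, oldest, newest

if __name__ == "__main__":
    count, average, oldest, newest = survey(sys.argv[1] if len(sys.argv) > 1 else ".")
    print(f"{count} files, average {average:.0f} bytes")
```

Small as it is, it touches real topics: walking a tree, handling errors, keeping running statistics.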
If you start getting good, buy a book like "Programming Challenges" and work through it. Or try some of the exercises in an algorithms textbook. Heck, just implement the algorithms the book describes and time them with different data sets.
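Timing textbook algorithms is a fine exercise because the theory becomes visible. Here's a minimal sketch of the idea in Python, comparing the classic insertion sort against the built-in sort across growing data sets; the timing numbers themselves will vary from machine to machine.

```python
import random
import time

def insertion_sort(items):
    """The classic O(n^2) textbook algorithm, straight from the pseudocode."""
    result = list(items)
    for i in range(1, len(result)):
        key = result[i]
        j = i - 1
        while j >= 0 and result[j] > key:
            result[j + 1] = result[j]
            j -= 1
        result[j + 1] = key
    return result

def time_sort(sort, data):
    """Return the wall-clock seconds one call to sort(data) takes."""
    start = time.perf_counter()
    sort(data)
    return time.perf_counter() - start

# Double the input and watch the quadratic algorithm fall behind.
for n in (200, 1000, 2000):
    data = [random.random() for _ in range(n)]
    slow = time_sort(insertion_sort, data)
    fast = time_sort(sorted, data)
    print(f"n={n}: insertion {slow:.4f}s, built-in {fast:.4f}s")
```

Seeing the quadratic curve show up in your own measurements teaches more than reading the big-O notation ever will.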
Don't bother jumping into Microsoft Windows, or the MacOS presentation manager, or X Windows at first. There's plenty of time for that later, and it'll all be easy to understand after you've got some notches on your belt.