The web has exploded with eulogies for Steve Jobs. I never met the man, but I wanted to write a little bit about my experiences with Apple and NeXT computers (shown below*) and explain why my daughter's first computer is a Mac.
It's hard to explain to the youth of today, who grew up with ubiquitous computing devices and omnipresent network connectivity, what it was like in the dark ages of computing. ;-) And even I, who grew up with solid-state electronics, was fascinated by stories of even earlier vacuum tube computers (and the armies of men and women who worked on them).
My story is that of a girl who grew up in the 1970s and 1980s in what later became known as Silicon Valley. Most of these stories are being posted by men, so I wanted to write one for my daughter, who is about the same age I was when I first started writing computer programs.
When I was in the 6th grade, my parents split up. I don't know how my mom found the money, but she sent my sister and me to a BASIC programming class at a local computer store. Back then, all computer stores were independent mom-and-pop operations that also served as a nexus for computer hobbyists. They offered meetings and classes nightly.
Most of the students were adults, but there was also a boy of similar age. His mom told me that, prior to marriage and motherhood, she had worked as an IBM programmer and wrote her code in machine language. I laugh when people tell me that programming in BASIC is too painful. (All that allocating and deallocating memory.) Have they programmed in machine language?
We didn't have money to buy our own computer, so we wrote our solutions to the programming exercises on paper and the teacher told us whether or not they would work. I don't recall if we were able to actually use a computer in the store to test our programs.
I got serious about violin and dropped computing until the 8th grade when I took it as an elective in school. The entire class of ~20 students shared a single Apple IIe. I had a leg up on the other kids because of my prior exposure to BASIC. Or, perhaps, I just naturally think algorithmically.
The Apple IIe was beige. Some marketing genius thought that beige would be less intimidating and more likely to be brought into people's homes. Perhaps that is an apocryphal story. But beige did reign for some time. Atari, IBM, Compaq, Apple--all beige.
I took another programming class in high school, where we used terminals connected to the school district's mainframe. They allowed kids to work on the same system that handled the district payroll! The only reason our high school was able to offer a CS class was that we were co-located with the district main office. None of the other schools offered it because networking was still in its infancy. Again, we programmed in BASIC.
In high school, my dad bought my sister and me an Atari 800 and we programmed it in BASIC. We also became very proficient in the Atari classic game, Asteroids. To this day, I cannot look at a bagel without thinking of Asteroids. (There are two options for handling the screen edges. The choice Asteroids makes, where an object drifting off the right edge reappears at the left edge and one drifting off the top edge reappears at the bottom, glues opposite edges together and makes the playing field topologically a torus, a.k.a. a bagel. The alternative is pure reflection off the edges.)
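If you want to play with the idea, here is a tiny Python sketch of the two edge-handling choices. The screen size and the test position are made-up numbers, not the real Atari ones; it's just an illustration.

    # A toy sketch of the two ways to handle the screen edges.
    # WIDTH, HEIGHT and the test position are made-up numbers.
    WIDTH, HEIGHT = 320, 192

    def wrap(x, y):
        # Asteroids-style: drift off the right edge, reappear on the left;
        # off the top, reappear at the bottom. Identifying opposite edges
        # like this is what makes the playing field a torus (a bagel).
        return x % WIDTH, y % HEIGHT

    def reflect(x, y):
        # The other option: bounce back off whichever edge you hit.
        if x < 0:       x = -x
        if x >= WIDTH:  x = 2 * (WIDTH - 1) - x
        if y < 0:       y = -y
        if y >= HEIGHT: y = 2 * (HEIGHT - 1) - y
        return x, y

    print(wrap(325, -3))     # (5, 189): off the right edge, back on the left
    print(reflect(325, -3))  # (313, 3): bounced back inside the screen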
I should mention that personal computers of all flavors used to boot up very quickly because they had very "lightweight" operating systems without any modern doodads. You turned the switch on and, in under a minute, you were confronted with a flickering cursor on a blank screen. It was easier to get work done because you had fewer distractions. When you turned off the computer, all of your programs disappeared--unless you had recorded them to an audio cassette tape. Reading your programs back off the cassette was always exciting because, unless you timed the play button just so, you didn't start at the beginning of the file and you lost it.
As a freshman at Cal, I took Programming for Scientists and Engineers, and we wrote our code in Fortran. I continue to use Fortran to this day because so much science code is written in it. We did our work on terminals connected to a mainframe. I can't recall how we saved our programs. I look back now and see that I did fine in that class. But, for some reason, I came away thinking that I was not cut out for computing.
By the time I was a junior, I could avoid computers no longer. Physical chemistry lab required us to write analysis and modeling computer programs. Although we were allowed to use a chemistry department computer lab full of IBM PCs (beige), the lab was closed on weekends and evenings. By then, we were saving our programs on floppy disks.
The CS dept computers were available longer hours than the chemistry dept ones. At the time, I was dating a guy in the CS department who would log me into a departmental computer so I could do my homework whenever I needed to and he was around. The only problem was that the CS dept computers ran Unix, and I hadn't learned it yet.
Fortunately, one of my friends taking the same class had previously held a work-study job helping a biology professor work up his data. He spent the time to teach her Unix because her labor was free to him (the university, not the professor, paid work-study students), and she took the time to teach me. We wrote our code in Fortran. My friend and I both did well in the class, and I do appreciate the help from the guy who lent us his login id. That would be against university rules today and I don't want to get him in trouble. I also realized that I might not be hopeless at computers.
My senior year, I took upper division numerical analysis in the Math department. It was reputed to be a "weeder" class that forced many a student to switch majors. Oddly, I didn't find the coursework difficult at all. The social environment in the computer lab was the main hurdle.
Sun had donated a batch of beige Solaris (a flavor of Unix) workstations to Cal, and the university installed them in the basement of the Math building, in a warren of rooms called the Web. Each little room contained ~4 workstations and there were about 30 workstations in all.
Contention for resources was high, but the department did not enforce time limits. Most of the machines were taken up by male students playing games, and the lab proctors refused to boot the game players off to let us do our homework.
Moreover, the boys had installed Sports Illustrated swimsuit images as screen backgrounds. Imagine sitting in a small semi-enclosed space with 3 young males, all looking at 20" screens displaying T&A (tits and ass). Would you feel safe? Now try to focus on your homework.
I asked a lab proctor once if he could get the boys to remove their screen backgrounds and reinstall the default one from Sun. He said that it was not his business to police behavior. The boys were in there for hours and the place smelled like them, too. It was a really, really hostile place for girls trying to do homework.
A female classmate and I banded together for protection. We would set our alarm clocks extra early and show up at 6 AM, just as they unlocked the lab for the morning. We had the place to ourselves and got our work done. By the end of the semester, we noticed that the other girls had developed the same coping strategy. The downside was that, at 6 AM, there were no teaching assistants or lab techs around if you had trouble with your code or your machine. We helped each other. We were allowed to program in either C or Fortran and I turned in my homework in Fortran. I also shared my rudimentary Unix skills with the other girls.
By the time I started grad school at CU Boulder, I was feeling pretty computer-proficient. A friend introduced me to one of the Unix help desk workers at CU and I visited him occasionally, both for CS help and socially. One day, in 1988, he was playing with a big black monolith on his desk. It really did remind us of the monolith in 2001: A Space Odyssey, with its sharp edges and glossy dark color.
He said it was a NeXT (pictured above) loaner for review. I asked how it differed from other Unix workstations. He pointed to it and replied, "It's black, not beige." Actually, the color choice hinted at other things that were different under the hood. Overall, my friend was very impressed with it and was sad when NeXT failed to gain traction and died.
In grad school, I took a class in Unix system administration and was a co-admin for my research group's workstation. After graduation, I took a more advanced Unix sysadmin course at my current workplace so I could do my own care and feeding of workstations. I also picked up several more programming languages. I made my first computer animation in 1990. I made my first website in 1994. If I needed to do something that required new computer skills, I would pick up a book (or several), ask friends for sample code and just do it. I did all of these on Unix systems. I even wrote my PhD thesis with LaTeX on Unix systems, though I did make some of the figures on a Mac and export them as PostScript for ingest into LaTeX.
Then Bad Dad said that he only wanted one operating system in our house and we should go with the dominant PC DOS/Windows. I was never very happy with Windows because I never knew where it was putting the DLLs. If you don't know where your executables are going, you aren't practicing safe computing. Anyway, I feel more at home with Unix/OS X than with Windows.
So, when Apple introduced OS X, which is built upon Unix, I agitated to get a Mac at home. There is something so reassuring about being able to open up an xterm and see a command line with a blinking cursor. It gives me a kind of warm milk in a cozy kitchen type of feeling.
OS X comes with interpreters and compilers for all sorts of different languages. You can get more by installing Xcode, and even more from SourceForge. When you compile a program, the executable stays right there in the directory where you invoked the compiler, or you can redirect it to a place of your choice. Either way, you know where things are going because YOU control it. It's not as clear on a PC.
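In case that's too abstract, here is a small (and admittedly contrived) Python sketch that writes a hello-world C file and compiles it two ways. The file names are arbitrary, and it assumes a C compiler is on your path as cc, which it should be once Xcode's command-line tools are installed.

    # Sketch: the executable goes wherever you tell the compiler to put it.
    # Assumes a C compiler is on the PATH as "cc"; the file names are arbitrary.
    import pathlib, subprocess

    pathlib.Path("hello.c").write_text(
        '#include <stdio.h>\nint main(void) { puts("hello"); return 0; }\n')

    subprocess.run(["cc", "hello.c"], check=True)                 # a.out lands right here
    subprocess.run(["cc", "hello.c", "-o", "hello"], check=True)  # or name and place it yourself

    for name in ("a.out", "hello"):
        print(name, "->", pathlib.Path(name).resolve())

No mystery DLL directories: both executables are sitting exactly where you told the compiler to put them.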
[I am really mad at Apple about the marginalization of Xcode. It used to come on a disk with every system. But then they stopped giving customers a disk and told us to download it from the Apple website. Now they charge $5 to download it. $5 doesn't sound like much, but this means that you need a credit card to get it. Kids don't have credit cards. So now they've placed a barrier in front of kids who want to learn how to program on Macs. And why can't a kid write an iOS program and load it up on their own iPods without joining the Apple developer program for an additional $99?]
I wanted to teach my daughter how to write computer programs. I didn't want her messing with my computers, so I bought her her own 13" MacBook Pro, set up Xcode for her, and bought her introductory books for Unix, BASIC and Python.
Unfortunately, the web has too many distractions. Sitcom humor on Disney.com is more alluring to her than a blank screen and a blinking cursor. When I was her age, I had less access to computers, but I knew more about programming and the inner workings of a computer than she does.
How do I turn her from a consumer to a creator?
* I snagged the picture of Tim Berners-Lee's NeXT computer from Wikipedia. It is the world's first web server.
Have you tried something like Scratch? Also, my hubby has a development environment (calicoproject.org) that uses several languages. It's free. It includes examples, and it's pretty easy to use. It's still under development, but mostly that means new stuff gets added all the time.
Thank you for this.
Have you and your daughter checked out App Inventor? It's free, and now housed at the MIT Media Lab (home of Scratch). App Inventor lets you create applications for an Android device in a mostly drag & drop environment, so that syntax is not an initial roadblock. (You don't actually have to have an Android phone, as there's an emulator you can use.) I've seen middle school and high school students go from 0 - 90 MPH with this, learning programming logic and whatever math they need as they go along, so that they can make the apps they need/want/find cool. Some tutorials to get started are on the appinventorbeta site. There's a really good book by Hal Abelson with even more tutorials, and then a good section of background and theory which lets you move beyond the "cookbook" or variations-on-the-tutorial approach. (As you can tell, I'm a really big fan - programming for the computers we carry in our pockets is enchanting/ challenging/ compelling and has led several students I know to say, "I like doing this. I'm going to go into CS.")
Python - which I haven't really taught with - has lots of fans and lots of learning resources online as well...
@Laura and @Elizabeth
Thanks for the suggestions. I will check them out.
Iris and I need to learn Python to do our Mommy and me AI class homework.
http://badmomgoodmom.blogspot.com/2011/09/mommy-and-me-ai.html
Actually, I could do it in a number of languages, but she needs to learn one. I heard at work that Python was relatively easy and ubiquitous (it comes w/ every Mac even w/o Xcode).
One time, at work, a tarball (tape archive file) containing 1.3 million SLOC (source lines of code) written in 5 languages (Unix scripting, Perl, C/C++, Fortran, IDL) was dropped on us. We had 30 days to evaluate the quality of the code and its suitability for rework and reuse. At times like that, I am really glad that I can read so many programming languages.
My boys have tinkered with Scratch, but it hasn't really taken off... in their case Minecraft seems much more appealing. I did get to walk through installing new classes in a .jar file with my 10-year-old, for some patch or other, so it's a start I suppose. App Inventor is a good idea, thank you Elizabeth - I'll try them on that...
Lots of good resources for Python learning in this post at Sean's place:
http://blogs.discovermagazine.com/cosmicvariance/2011/09/08/python/
Your experience in the basements of Cal with the smelly boys seems to be common to all the women scientists/programmers I know who started in programming after workstations/PCs became available. My undergrad experience was with card decks (eek) and later green IBM screens, Fortran, assembler, etc. The environment was much more friendly to girls, particularly as several of the professors were women. It makes me very sad to hear tales like yours...
Python is huge in bioinformatics, but I'm old, so I know Perl instead.
I didn't turn into a creator on a computer until grad school, although I took programming classes from time to time throughout my education. For me, it was having a problem that I wanted to solve that made me make the transition.
What frustrates me the most about the story of the computer basement at Cal is that those same boys probably assumed then (and perhaps still assume now) that the only reason there aren't more women in their field is that "women just aren't as interested in computers."
@Cloud
I forgot that the tarball contained Perl scripts, too.
Basically, cron jobs kicked off Unix scripts that ran Perl scripts to see if all the data had arrived yet. If they had, then the data crunching commenced.
So it included Unix, Perl, C/C++, Fortran and IDL. Whew! Job security for us old-timers.
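For the curious, the glue logic was roughly the following. This is a Python sketch of what the shell and Perl scripts did, not the real code; all the file names, paths and the crunch command are made up for illustration.

    # Sketch of the cron-driven glue described above: a scheduled job checks
    # whether every expected input file has arrived and, only then, kicks off
    # the heavy number crunching. All names and paths here are hypothetical.
    import pathlib, subprocess

    INCOMING = pathlib.Path("/data/incoming")                         # made-up drop directory
    EXPECTED = {"ephemeris.dat", "telemetry.dat", "calibration.dat"}  # made-up file set

    def all_data_arrived():
        present = {p.name for p in INCOMING.glob("*.dat")}
        return EXPECTED <= present          # set inclusion: everything we need is there

    if __name__ == "__main__":
        if all_data_arrived():
            # hand off to the Fortran/C/IDL processing (placeholder command)
            subprocess.run(["/ops/bin/crunch_numbers"], check=True)
        else:
            missing = EXPECTED - {p.name for p in INCOMING.glob("*.dat")}
            print("still waiting on:", ", ".join(sorted(missing)))

Cron simply ran a check like this every few minutes until everything had shown up.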
"How do I turn her from a consumer to a creator?"
When I was a high school student, my school offered BASIC programming courses on Commodore CBM computers - the ones with the full-size keyboard. This was a huge and recent upgrade from the Commodore PET computer, which had a keyboard that must have been all of 6" wide. We thought it was the height of computing power to have a shared set of floppy disk drives, but mostly we kept our classwork on audio tape. Our goal was to play games, but none of us owned any, so we often programmed our own.
Later, the school offered Pascal programming on Apple IIe computers, which I took. Games were available at that time, usually pirated, but a certain level of sophistication was required to interact with the computer, and if you wanted to copy a game disk you had to have at least a working knowledge of assembly language (in order to find the code that checked for an original disk and NOP it out).
However, by the time I left high school, or perhaps soon after, the programming offerings had dwindled and the school was offering word processing classes instead. Games were abundant, both on the game consoles of the time and, on computers, via programs like Locksmith that didn't require a high level of knowledge to pirate a game disk.
Does the rise of increasing levels of abstraction and sophistication in computing technology naturally decrease our creative urges? I would argue yes, because the easy itches have already been scratched and it feels pleasant. For example, a text editor / word processor comes standard with computer OSs now, but that wasn't always the case, and at one point I spent a bit of time trying to write one for my Apple II+.
So I would argue that the way to turn a consumer into a creator is to generate an overwhelming urge to create. Since all the obvious creations have been done, you need to find a niche that is not obvious. Can a school science project motivate some modeling or demonstration that doesn't already have a canned solution preloaded on the computer?