Archive for the ‘Uncategorized’ Category

Making Jean Bartik’s prediction come true

March 28, 2011

Jean Jennings Bartik died last week. If you are one of those people asking "who is Jean Bartik?", then you might consider reading the obituary at CNN. Although she was born, raised and educated in Missouri, I found no mention of her in the St. Louis Post-Dispatch, not even in the "national spotlight" section of the obituaries. Lenin's niece made the list, but not Bartik. Isn't it amazing that the passing of an individual who made a significant contribution to the development of computing and to winning WWII didn't make the news?

Jean Bartik was one of the six women who programmed the first general-purpose computer, the Eniac. But, according to Bartik, "for many years in the computing industry, the hardware was it, the software was considered an auxiliary thing," and so neither she nor her five colleagues got much credit. In fact, when they were showing off the Eniac, managers downplayed their role, simply making them out to be models who posed in front of the machines. (You can learn more about them at the Top Secret Rosies website or the Eniac Programmers website.)

According to CNN, in February Bartik said women hadn't gotten far enough in technology, but she saw a promising future. This comment made me think — why? The first programmer, Ada, the Countess of Lovelace, was a woman. These six women played a significant role in early computing, as did Grace Murray Hopper. Why don't more women see them as role models and choose IS as a field for their careers? And what can we do to make that future more promising for the next generations of women?

As most people who know me will attest, I am an advocate for getting more girls and women into technology. Why? Because it is a great field and the field needs them! The balanced advocacy I have in mind is best represented by an article I read a few weeks ago: What is a Woman in Tech? When I say "advocating for women," I am talking about educating women about the range of opportunities associated with an IS degree and giving those women a fair chance in the field. It does not mean preferential treatment or advancement simply because someone is a woman. It means helping to eliminate hostile IS workplaces and to build a supportive community.

Does that mean I want fewer opportunities for my male students and colleagues? NO! Putting a woman in a position for which she is not prepared only makes the whole problem worse for everyone!

Last week I also read a study out of the University of Washington showing that girls are already picking up, by second grade, on stereotypes about what is and is not an appropriate career path for them. WOW! How can anyone know what they can and cannot do by the second grade?! How can we fight the stereotypes if they start that early?

I, unfortunately, do not have a solution. I believe the proactive efforts of all of us are needed to make the accomplishments of all of these women, from Ada to those on the front line today, known to young women. I believe we need those same proactive efforts to help those young women see the possibilities that are available to them. Finally, I believe we need those same proactive efforts to make the field welcoming to those women. It takes all of us to change the tide of decreasing numbers of women in the field. It takes all of us to fulfill Jean's hope that the future is promising for young women in computing.

The Essence of an IS Professional

August 23, 2010

Recently I read the blog of Nicole Sullivan-Haas, who uses the name Stubbornella (http://www.stubbornella.org/). I don't know why she uses that name, and it is not a blog I generally follow (but I may start). In this particular entry, she discusses women in technology. But that is not the part to which I want to direct your attention. Rather, she provides a nice dichotomy between good developers and bad developers (the specific post is at http://www.stubbornella.org/content/2010/07/26/woman-in-technology/).

The code cowboy

* Stays up all night recoding the entire code base, documents nothing, and forbids anyone to touch it because they aren’t good enough to understand his level of code.
* Refuses meetings, chats, or any other form of communication.
* Cares more about being perceived as the brilliant-uber-genius than he does about his team working well together.
* Gets into silly pissing contests which boil down to “hehe, my brain is bigger than yours”.
* Finds complex solutions to problems, thus proving his brilliance.
* Makes a lot of mistakes due to lack of sleep, overcaffeination, and ego — but thank god he is around to save the day when the bug is discovered.
* Is fairly certain clients, PMs, designers, and really anyone he has to deal with on a daily basis is at least three standard deviations below his IQ.
* Jumps to say “me, me, me!” when credit or rewards for accomplishments are offered.
* Jumps to say “me, me, me!” when opportunities to attend or speak at conferences arise.

The good developer

* Digs the fact that he is making products for people. Likes people and enjoys communicating with them and understanding how they think. Can put himself or herself in other people’s shoes and reliably imagine how they might react to different parts of the UI.
* An excellent problem solver who takes into account all aspects of a challenge when designing a solution – including human elements like maintainability and usability.
* Shares credit with the entire team or entire internets. Recognizes that no solution evolves in a vacuum.
* Applies consistent effort and recognizes that working in a way that promotes long term productivity will yield better results.
* Respects the members of his team, including those who aren’t engineers.
* Manages projects so they don’t require super human feats of sleeplessness to meet deadlines.
* Has a life outside of work, other interests, friends, and family — they love code, but they love lots of other things too. If you don’t understand how this makes them a better developer, see item #1.
* Amazing capacity for abstraction and creative thinking.

This is a reasonable view of the dichotomy of technology professionals. It particularly appeals to me as I face a new semester with two sections of “systems analysis.” One of the major purposes of the class is to transform people who are in the first column into people in the second column. Believe me, sometimes it is easier to turn lead into gold!

The goal of an Information Systems degree (in contrast with a computer science degree*) is to focus on how the computer can help the enterprise. The goal is to set the business priorities first and then see how computers can reasonably help the enterprise meet those priorities faster, more cheaply and with less stress. To be successful, IS professionals must understand the business better than the people in those professions do. This is why we require all of those business courses. It requires an understanding of where the business is going and how the system needs to support that growth.

Any professional will want to optimize the product he or she produces – make it bigger and better than anyone has done before. Sometimes, however, that means it costs too much or takes too long to produce. Instead, it needs to “satisfice” – to be good enough given the constraints on the system. As a profession, we don’t do that very well. One kind of constraint that we handle especially poorly is the human component: what can we expect that human to do and to know, and what will that human expect of the system? Said differently, as IS professionals, we need to know how the customer thinks and make sure that the system responds to that well. As a profession, we need to get past the code cowboy behavior, show empathy for the client, and show creativity in our solutions.

So, what’s my point? First, for all of you who are not in IS because you think you must be like the people in column one above, PLEASE change your majors and join us – we need more people of the type in column two. Second, for those of you who want to know how to practice the profession better, focus on the first point in column two – how can you make the system work better for the business, including the people, who work there? Third, of course, if you have any advice on how to transform people from type one to type two (or to transform lead to gold for that matter), please share!

Kindergarten Engineering

June 16, 2010

An article in the NY Times today discussed a new movement to teach engineering to primary students. According to the article,

Supporters say that engineering reinforces math and science skills, promotes critical thinking and creativity, and teaches students not to be afraid of taking intellectual risks.

Clearly, my next step was to check out the available modules; I was disappointed to find there were no modules addressing information technology. It was an interesting set of topics, though, set in different contexts and different countries. I looked at the industrial engineering module (I do, after all, have a BSIE). The module introduces the topic of how machines make work easier, a traditional topic in IE. The kit includes a children’s book (which chronicles two young girls’ trip to a potato chip factory, where they learn how machines make work safer), a teacher’s manual, a DVD with vignettes about machines and a set of materials. Not a bad collection.

So, on and off today I have been thinking about what activity could be developed to help young children appreciate programming or other aspects of technology. I thought about creating a “human computer” where only some children could compute and others could write on the board and others carried messages, etc. Would that help them understand it? What kind of book would go along with that topic?

Then I thought about having them write precise instructions for how to do something. According to the teachers interviewed, the program is meant to “take students step by step through the engineering process: design, build, test, evaluate.” Well, those are good things to learn. But how do we apply them to information technology?

I still don’t have an answer, but I do have a question. Does anyone else have an answer?

Babbage’s Difference Engine Number 2

April 22, 2010

I love history. I do not read as much about history as I should, but I love to visit places, learn their history and, of course, visit their museums. In addition, I am a geek. I love computers and what they can do. In fact, I like these two subjects so much that I started a computer museum here at the University called Grace’s Place. So when I had the opportunity recently, I jumped at the idea of visiting Babbage’s Difference Engine. To many, this engine is the beginning of computing as we know it. Certainly Babbage’s later machine, the Analytical Engine, is a computer in the modern-day sense of the word. I would love to see how a 19th-century inventor thought of computing, but Babbage never built his Difference Engine. However …

Charles Babbage was a mathematician, inventor and philosopher. He was troubled by imprecision. He hated imprecision so much that he once wrote to Lord Tennyson to note the imprecision of his poem “The Vision of Sin.” In that poem Tennyson wrote, “Every minute dies a man, Every minute one is born.” Babbage replied that the poem could not be factual, or the population of the earth would be flat. He suggested instead that Lord Tennyson change the poem to read “Every minute dies a man, And one and a sixteenth is born.” Further, he said, “I may add that the exact figures are 1.167, but something must, of course, be conceded to the laws of metre.”

But, I digress. Back in the 1830s, all mathematical tables were computed by hand. (I realize that my students have probably never seen a mathematical table, since they now get all of those values directly from some computing device. They will just need to trust me that there was a time when people did arithmetic by hand and used tables.) These tables were critical to computations needed for building, astronomy and navigation. Even minor errors in the tables could cause buildings to fall and sailors to be lost forever. And there were many errors. So Charles set off to create an “engine” that would do the computations exactly. How he came up with the idea of the Difference Engine, I cannot imagine. But Difference Engine 1 was designed to include approximately 25,000 parts, weigh fifteen tons, and stand 8 feet high. He took his idea to the British Government for funding; they too thought it was a good idea, so they granted him funds to build the engine.

Unfortunately, Babbage was not a good project manager. He was, after all, a perfectionist, and so his project suffered from scope creep as he continually improved on his design. In addition, he was not an easy man for whom to work (remember his writing to Tennyson to change his poem?), and he suffered some significant losses close together. Needless to say, he did not get the project finished.

Learning this is important for two reasons. First, it was a surprise to me because I had learned that the problem was the precision of tooling available in the 19th century. Apparently that was not true. Second, think about how many inventions and developments never happened because of bad planning. It has been said that this first experience with calculating devices so soured the British Government that it would not fund projects of this type for some time, which, in turn, stunted the development of calculating devices. I wonder how much more advanced our capabilities would be had Babbage been a good project manager!

Babbage was not discouraged, and he went on to design Difference Engine 2. This one was smaller and had fewer moving parts, but the British Government had no interest in funding the project and he could not get funding from any other source. After his death, Babbage’s son gave the plans for this engine to the London Science Museum. In 1989 (about 150 years after its design), the London Science Museum decided to try to build Difference Engine 2. They used Babbage’s plans and 19th-century manufacturing precision. Where Babbage’s plans were incomplete, the builders used other functionality that Babbage had designed. In the end, it was all Babbage’s design. It was tested at the London Science Museum and correctly returned the value of a polynomial to 31 digits!

It turns out that there are two Difference Engine No. 2 machines in existence. The London Science Museum ran out of money before it could finish the original piece; it had built the engine but did not have the funding to build the printing component (on the left in my photos). The museum approached Nathan Myhrvold for funding, and he provided it with the stipulation that they build him a replica of their engine.

Myhrvold agreed to display the difference engine at the Computer History Museum in Mountain View, California, for a year after taking delivery. It actually has been there longer than that because Mr. Myhrvold needs to fortify his living room floor before he can move the engine there.

So, during spring break my husband and I visited the Computer History Museum and Babbage’s Difference Engine No. 2. This is like visiting history that even the people involved never saw. It was a great experience.

I must tell you that it is a beautiful machine! The bronze set off from the black frame is just lovely. But all of that paled next to the idea that this piece of equipment was designed almost 200 years ago. It was like going back into history and seeing what inventors were doing. One can almost imagine James West and Artemus Gordon using this engine on the Wild Wild West (this was a television show that had detectives using fanciful inventions of modern conveniences as they might have looked in the Victorian Era).

We were lucky enough to be at the museum when they were demonstrating this engine. Someone literally cranked the numbers! Remember, Babbage didn’t have electricity, so he had to control for sloppy cranking too. You put weights on the structure to set up the polynomial you wanted and cranked away. In the front you saw the gears moving around. In the back you saw the levers and switches that controlled all of the carrying necessary for the computations. It was captivating. At the end, the printer gave us the value of the polynomial and was ready to start again. If we wanted, it would also prepare a mold for a copper plate so the values could go right from the computer to the printing press. Just imagine!
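
For the programmers in the audience, here is a minimal sketch, in Python rather than brass, of the method of finite differences that the engine mechanizes. The example polynomial is my own made-up choice; the point is that once the initial column of differences is set up, every later value of the polynomial comes from additions alone, which is exactly the kind of work a column of wheels and a crank can do.

```python
# A sketch of the method of finite differences behind the Difference Engine:
# after the initial differences are computed, every new polynomial value is
# produced by additions alone; no multiplication is needed.

def initial_differences(poly, start, step, order):
    """Evaluate the polynomial at order+1 points and reduce them to a difference column."""
    values = [poly(start + i * step) for i in range(order + 1)]
    column = []
    while values:
        column.append(values[0])
        values = [b - a for a, b in zip(values, values[1:])]
    return column  # [f(x0), first difference, second difference, ...]

def crank(column, turns):
    """Each 'turn of the crank' adds each difference into the value above it."""
    results = []
    for _ in range(turns):
        results.append(column[0])
        for i in range(len(column) - 1):
            column[i] += column[i + 1]
    return results

# Example: tabulate p(x) = 2x^3 - 3x^2 + x + 5 at x = 0, 1, 2, ...
p = lambda x: 2 * x**3 - 3 * x**2 + x + 5
print(crank(initial_differences(p, start=0, step=1, order=3), turns=10))
print([p(x) for x in range(10)])  # the same values, computed directly
```

The real engine, of course, worked to 31 digits on far larger polynomials than this toy; the sketch just shows the principle that let Babbage replace error-prone hand arithmetic with a crank.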

To me, the experience was a lot like finding a long lost Renoir or Monet painting that no one knew about and having the privilege of being one of the few people who could enjoy the experience. It was like being a little kid again and being amazed. Yes, I know that makes me a geek.

But, go ahead and look at the machine. I have posted some photos in an earlier post, “Wordless Wednesday.” Direct your browser to the Computer History Museum and watch the engine work. You too may be captivated. For more information about the history, there is a YouTube video. See the process of unpacking the machine when it arrived in Mountain View, and watch the program from the opening of the Babbage Engine exhibit at the Computer History Museum.

Yes, it is history. Why is history important? For two reasons. First, it is important to see how far we have come in computing in order to project where we might go. Second, it is important to “get into the head” of great inventors to help ourselves learn to invent better so that we can solve the many problems of the 21st century. In this case, the history is also beautiful and mesmerizing. Check it out!

My Clock

April 12, 2010

As anyone who has been to my office can tell you, I have a “thing” about clocks. It all started with one I bought at an art fair: nothing more than an old hard drive controller card turned into a clock with a kit you can get at Michael’s. But then I saw a binary clock and bought it. Then I found a clock that is driven by computer boards and displays on old Nixie tubes. And I have a digital clock, but instead of numbers, it lights up squares in four positions — oh yes, and the squares that are lighted change every second. I added one to our lab that runs backwards and has the line “think creatively” next to it. My clocks are fun. They don’t get me places any earlier, but they are fun nonetheless.

I had a birthday recently (and no, I won’t tell you how many). I want to brag about the gift my wonderful friend Margaret gave me — it was a clock. But, given my collection, not just any clock would impress me! Margaret knitted me a clock, as shown in the photo below.

She used all the colors I love and demonstrated her amazing knitting skills. I could not replicate this clock in a knitted form for all the tea in China.

But, look at the clock straight on as shown by the photo below. Do you notice anything about the numbers?

Yep, the numbers on this knitted clock are in binary! Isn’t that amazing? Can you make them out? Well, for those of you who cannot do binary quickly, I have given you a quick key below (and, right after it, a short code sketch that generates the same table).

1: 0001
2: 0010
3: 0011
4: 0100
5: 0101
6: 0110
7: 0111
8: 1000
9: 1001
10: 1010
11: 1011
12: 1100
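
For the programmers among you, here is a tiny Python sketch (my own addition, not part of Margaret’s pattern!) that generates the same key by printing each hour of the clock face as a four-bit binary string:

```python
# Print each hour of the clock face as a four-bit binary number.
for hour in range(1, 13):
    print(f"{hour}: {hour:04b}")
```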

So Margaret gave me a perfect gift! It is a unique work of art. And, it has a technology theme. What could be better for me? The only thing better is a perfect friend — which, of course, Margaret is!

Ada Lovelace Day

March 24, 2010

Today is Ada Lovelace Day, a day on which we celebrate women in technology.  The intent is that we all blog about women in technology whom we admire not only to celebrate their achievements, but to provide role models for others.

As I write this I can see my male students and colleagues shrug their shoulders, shake their heads and mutter something about this being Women’s History Month.  So I believe I should begin this essay with an answer to the question, “Why should we care?”  That’s easy…

The IT field suffers from terribly high failure rates for systems.  I could cite statistics, but I suspect that no one reading this needs to be convinced that software projects often run over budget, don’t meet users’ needs or simply don’t work.  The most commonly cited figures, of course, come from the Chaos studies, which estimate that a large share of all projects fail or struggle: about 18% are total failures and another 53% are “challenged.”  A myriad of solutions have been proposed to address these system problems.

I would propose that the problem is not the methodology or the tools or even upper management’s support, but rather the mix of the team that is developing the product.  Right now the majority of systems developers are male.  Worse yet, the number of women in the pipeline shrinks every day.  When I began in this profession, women were happy because it was a level playing field compared to other disciplines in engineering.  Women flocked to the discipline, which at one point was about 30% women.  Today, women are shying away from it.  As I write this, I know that I am about to go teach a class that has no women in it.  In my early years as a professor, that never happened.  Today I see it more and more.

So, what I suggest is that the goal of better technology is best served by increasing the diversity of the teams that develop it.  Countless studies show that diversity in a team leads to better products.  As a recent Ernst & Young report points out, a group of intelligent problem solvers chosen at random will outperform a homogeneous group of even the best problem solvers, under the right conditions.  Will it work?  Well, teams comprising both men and women produced the most frequently cited patents, with citation rates 26 to 42 percent higher than the norm for similar patents (i.e., diversity promotes innovation).  Companies with the highest representation of women on their senior management teams had a 35 percent higher return on equity and a 34 percent higher return to shareholders.  And we have only to look to our colleagues in production to see that it is true.

There are lots of theories as to why women do not select the profession, and it is up to all of us – male and female – to think about increasing the percentage of women in the field.  It is not a “women’s problem”; it is a “better technology problem” that belongs to all of us.  So that is why days like Ada Lovelace Day, on which we celebrate the women who are in the field and attempt to attract more, are so important.

I want to celebrate all of the women in technology whom I know.  I know countless women who have wonderful careers and I could not possibly enumerate all of them here.  What I want to do, however, is to celebrate a trait that I see in the ones that are happiest and most successful.  That trait is to “do it your way.”   Early on, when there were few of us, we attempted to blend in, to be “one of the guys.”  Now there are enough of us that we can each celebrate our own individuality and gifts.  As we celebrate our differences, we begin to understand how those differences can be channeled to bring about the improvement in technology.  It is through those different views that we can see the problem, and thus the solution, for what it is.

So, I celebrate Mary Fowler, who approaches computer problems and system design with an almost “psychic” approach.  I am not sure she could ever chart how she sees the problems or decides on the solutions, but she is brilliant in her analysis and design.  I also want to celebrate Sheila Burkett, who approaches problems in a more traditional way, with specific plans and procedures.  She can tell you how she got there, and she is also brilliant in her analysis and design.

Those who are best find the part of the field that they like and then practice the Shakespearean quote (from Hamlet): “This above all: to thine own self be true” (emphasis added).

Personal Privacy

March 18, 2010

My undergraduates are increasingly teasing me about my reluctance to keep information online. In fact last week they had me wondering if I was, in fact, getting too old for the field. I used to be one of the people close to the bleeding edge – am I getting too conservative for that position now? Does that mean I can no longer be an effective teacher and researcher? In other words, I was having a bit of a crisis of confidence.

Then two things happened. First, I spoke with my son (who is always a source of wisdom) about my quandary and found that my concern about privacy online is not as conservative as his. So, maybe it is not my age? Second, I read a Time Magazine article about how we have the largest generation gap in history and it is all about this question of how much should we put online. OK, so maybe it is not just me?

Then this morning, while reading the New York Times, I felt my confidence return. In an article, “How Privacy Vanishes Online,” the authors discuss how people can deduce data from the personal information posted on social networking sites – not only one’s own site, but also those of one’s friends. In fact, it states, “Computer scientists and policy experts say that such seemingly innocuous bits of self-revelation can increasingly be collected and reassembled by computers to help create a picture of a person’s identity, sometimes down to the Social Security number.”

The article went on to remind the reader of last year’s MIT experiment, during which students were able to identify, with 78% accuracy, the sexual preference of students based upon their profiles (which, of course, were stripped of that information). A more recent example is Netflix’s decision to cancel its second competition because, although the records were stripped of identifying information, individual users could be identified by their unique viewing histories.

People judge you by the information you share and by your friends. This information can be put together into a distinctive “social signature” (as the researchers quoted in the article call it) that can be used for good, bad and neutral purposes. We tend not to look at our social signature the way prospective employers or others might.

I worry that we have a generation of people growing up putting all kinds of things online that will come back to haunt them in later years, when they least expect it. Once online, of course, things cannot be taken back. They exist there forever. Just look at our web presence as captured by the Wayback Machine (see http://www.archive.org/). All those old web pages that may or may not have things on them that we would not want now (and certainly have design flaws that we would rather not see again) are available for anyone to view. Can we say there will not be a parallel site for Facebook, Twitter and other social networking sites?

So, after having my crisis of confidence, I have reassured myself that the problem is that I have more knowledge about what can be done with the data than my students, so I can see possibilities that they cannot. In addition, I have the wisdom that comes with more years to see how one’s personal views change.

I used to tell my students that it was critical that they manage their careers, starting from what courses they take. I will now add that it is critical to manage their social signature so that they can have a career to manage (among other things).

Why worry about diversity in the IS Field?

March 18, 2010

Alpha Trade Finance is a publication that I never read. However there was a link in a Mentoring newsletter that caught my attention: “Ten Things Companies – and Women – Can Do To Get Ahead.”  As advertised, it really does identify 10 things companies can do and then 10 things women can do to get ahead.

Many of them are things we all have heard before. However, the first item needs repeating:

Work toward “Functional Diversity”: Professor Scott Page of the University of Michigan uses this term to capture the idea that we need people with diverse ways of perceiving problems, rather than groupthink, in order to devise better solutions. As a recent Ernst & Young report points out, a group of intelligent problem solvers chosen at random will outperform a homogenous group of even the best problem solvers, under the right conditions.

That caught my attention because it identifies the primary reason I believe we need to increase the number of women (and other minorities) in the Information Systems field. The field suffers from terribly high failure rates for systems. Study after study has shown that diversity leads to better products. So, it is in everyone’s interests that we increase the diversity of the IS field. Don’t think of it as a “women’s problem” … think of it as a “better technology problem” that belongs to all of us!

Cybersecurity

January 26, 2010

In today’s New York Times there was an article entitled “In Digital Combat, U.S. Finds No Easy Deterrent.” This article discusses simulations run by the Pentagon of how to respond to systematic cyberattacks. The simulations were run in response to the hacking against Google and 30 other U.S. companies that has been in the news recently. The result of the simulation?

The results were dispiriting. The enemy had all the advantages: stealth, anonymity and unpredictability. No one could pinpoint the country from which the attack came, so there was no effective way to deter further damage by threatening retaliation. What’s more, the military commanders noted that they even lacked the legal authority to respond — especially because it was never clear if the attack was an act of vandalism, an attempt at commercial theft or a state-sponsored effort to cripple the United States, perhaps as a prelude to a conventional war.

The implications for national security are scary. One participant in the game admitted,

“The fact of the matter,” said one senior intelligence official, “is that unless Google had told us about the attack on it and other companies, we probably never would have seen it. When you think about that, it’s really scary.”

But there are smart people working on this problem, and so eventually I believe (OK, I hope) it will be solved.

However, what no one is discussing is the implications of this for cloud computing. The idea behind cloud computing is that you keep your data, programs and the like residing on someone else’s computer, or “in the cloud.” The data, the programs and everything else are available through the internet. OK, sounds like a plan. BUT, what happens if these cyberterrorists attack your company or attack the cloud? Then everything that you need to run your business is suddenly unavailable. Isn’t that scary too? Are companies prepared to take this risk — especially after the Google incident? Are cloud companies planning for this kind of problem? How are people responding? I think this needs to be part of the planning process — especially in light of the dire results of the simulation.

Share your views!

January 24, 2010