A Little Perspective Please!

We’re in serious yoghurt.  “Unfettered AI (Artificial Intelligence) will (well, could) wipe out humanity.”  See here for details.

The time is somewhere between 2040 and “in a few centuries”.  See here.  All of this is coming from VESPs (see previous post), some earned, some self-proclaimed.  Some, searching for government grants.  (Sell the sizzle, not the steak, right?)

Cynical, sure, why not.  How about, “Watson can unlock the vast world of unstructured data” here.  My “government math” calculates that to be about “half” right.  Ok, enough.

Back to basics.  Computers (the engine behind and driving all of AI) are really good at calculating & moving data around.  It’s impossible to really conceptualize (at least for me) how fast.  See an earlier post, Computers, People: Why can’t we just, well, talk? for a way to visualize that speed.

“Us folks” OTOH, are able to think and make judgements, albeit sometimes flawed, on very incomplete information.  Recently, we were having dinner with our grandkids, and I asked Sarah (10) if she would get a folder that I had put on the piano bench.  It wasn’t on the bench, but on a music stand by the piano.  No instructions; she just looked around, saw it, and brought it back.  Didn’t even mention that it was not on the bench.

Try that with a robot.  Not impossible, for sure, but it would take a lot of design work, programming, and testing — to say nothing about the physical part of walking, searching, recognizing, etc., that would be needed.  Robots and computers need specific instructions — very specific!

Getting computers, mechanical devices, and people to work together effectively is tricky, but as you can see here, (Tesla video) amazing progress has been made. But understand, that compared to activities that we do everyday in an unstructured world, all of those processes are very simple — even trivial.

As to AI, check out Michio Kaku here.  The operative line is, “The current state of AI is that of a retarded cockroach”.  A bit strong, good for headlines. Still, take over the world?  Have to wait awhile.  Send me $10, I’ll give you 10 to 1 odds it won’t happen.

So much effort is spent in trying to simulate the human brain.  “We just need more speed and more data.”  Maybe, but check out those old movie clips of the earliest airplane attempts.  They had flapping wings.  How’d that work out?  Trying to simulate nature isn’t always the best approach.  All of us learn a set of complex tasks (walking, speaking a language or two, etc.) without formal classes or government grants.  Our brain is more a learning machine than a computer.

Isn’t it more reasonable to think that our “learning machines” will do just that? Figure out how to use the enormous power and potential of computers and robots, to our advantage?  We have lots of time.

Then there is the current problem of what to do about social media and our kids (us, too).  But that’s another story.

Later.

Looking Ahead, (with 50 some Years Experience) — Why I’m starting SuperSOLVRS

Here’s the future as I see it.  (Think cell phones.  Started in 1973 and now more cell phones than people.  Uses no one ever imagined.)

1.  Technology will be even more encompassing and invasive.
2.  Robots (e.g., machines) will take over many jobs, starting with the most routine procedural.
3.  There will be an increasing job demand for technical skills, particularly computer programming.
4.  Computer programming from simple to complex will be needed to deal with all of the existing and new technology.
5.  Increasing demand for problem solving skills — especially those that determine what parts of a solution can be computerized.

Now the “elephant in the room”: Are our current educational institutions up to the training/preparation task?  I don’t think so — at least as presently constructed.  Lots of folks agree.  Why?  Several things:

1.  About 50% of tax $ meant for schools currently go to pensions — and that % will increase.
2.  Schools work in a sea (quagmire?) of regulations.  Example: The controversial “No Child Left Behind Act” has been replaced by ESSA (Every Student Succeeds Act).  Better?  Maybe, but still extensive and complex.  My point is that getting a new course approved is difficult and very time consuming.  We’re talking years.
3.  Current and increasing emphasis on “soft” subjects and Social Emotional Learning.  Interestingly, budgets for the Arts are also decreasing (that’s another story).
4.  K-12 school teachers are rarely professionals in the field that they teach.

OK, now the crucial question: What if the technical needs change faster than the time needed to create appropriate courses — and get them approved/implemented?  Answer: “We are toast!”  Probability of that happening?  About 100%.  Again, think cell phones as a small example!  Wait, there are courses for cell phone use.  Where?  Not in schools!

Unfortunately, my crystal ball is in the shop for its annual maintenance and polish, so I can’t give you an accurate time line of future events.  What’s going to happen and when?  My best answer is, “I dunno”.

But, whatever happens, I’d bet serious money that two broad skills will be needed:

1.  Problem solving skills and computational thinking (sort of the same thing)
2.  Knowing how computers work — better with programming skills.

by CMN

What’s This About?

1.  For middle schoolers (5-8 grades)
2.  Learn professional problem solving methods (a la Polya).
3.  Learn professional program design process (90% language independent).
4.  Learn to fail successfully.
5.  Build intellectual foundations — be prepared for “whatever” career.
6.  Understand how a computer works using its own (machine) language.  Follow with Scratch and Python.  Develop some proficiency.
7.  Learn how to work in teams.
8.  Learn how to learn — both with and without formal instruction.

Making this all happen is not a spectator sport.  Learning requires do-ing. Just like learning a musical instrument.  Big difference between just listening and playing.  Playing requires proper training, and lots of practice.

We’ll start with a week (5 days, 1.5 hrs/day) this summer and then on a once-per-week basis after school.  Lots of projects.  By fall there will be video offerings on the internet.  Eventually the whole program will be available on-line.

More later.


Next (small) Step Towards Humans: Assembly Language

Sorry, this was supposed to be published last year after the LMC posts.

Even with the simplified LMC computer, programming with just the number commands gets tedious very quickly.  Sure, you start to recognize 901 as input, 360 as store what is in the A register (actually a copy) into memory location 60, etc.

However, if the program is more than a few steps, it is very hard to “see” the logic.  The worst is that if you leave out a step and put it in later, you must move all of the following code down a step and change all of the branch references.  No fun.  The whole process is very mistake prone — even for super-stars.

The first step in humanizing is to write the program using words and symbolic references.  INP replaces 901, STA 60 replaces 360, and even better, STA TEMP means store what’s in the A register in a location named TEMP, one of the 100 locations that the computer will know TEMP refers to.  99.999% of the time you don’t really care where the data is stored once the program starts working on it.  You are only concerned with the input data and the output (results) data.

Of course, the computer has no clue what INP, STA, and other symbols mean, so you must have a program, called an assembler, that actually reads these letters and assembles (i.e., converts) them to proper machine language.  (Thus the clever name, “Assembly Language”.)
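To make the idea concrete, here is a toy two-pass assembler sketched in Python (my choice for illustration; a real assembler handles much more).  Pass one figures out which mailbox each label lands in; pass two converts the mnemonics and labels into 3-digit machine codes, using the standard LMC opcode numbers.

```python
# A toy two-pass assembler for a handful of LMC instructions.
OPCODES = {"ADD": 100, "SUB": 200, "STA": 300, "LDA": 500,
           "BRA": 600, "BRZ": 700, "INP": 901, "OUT": 902,
           "HLT": 0, "DAT": None}

def assemble(lines):
    # Pass 1: each line gets a mailbox; remember where labels land.
    labels, parsed = {}, []
    for addr, line in enumerate(lines):
        parts = line.split()
        if parts[0] not in OPCODES:      # first token is a label
            labels[parts[0]] = addr
            parts = parts[1:]
        parsed.append(parts)
    # Pass 2: turn mnemonics and label references into numbers.
    code = []
    for parts in parsed:
        op = parts[0]
        operand = parts[1] if len(parts) > 1 else None
        if op == "DAT":                  # reserve a data mailbox
            code.append(int(operand) if operand else 0)
        elif operand is not None:
            code.append(OPCODES[op] + labels[operand])
        else:
            code.append(OPCODES[op])
    return code

program = ["INP", "STA TEMP", "INP", "ADD TEMP", "OUT", "HLT", "TEMP DAT"]
print(assemble(program))   # [901, 306, 901, 106, 902, 0, 0]
```

Notice that TEMP ends up meaning mailbox 6.  You never had to pick that number yourself; the assembler did.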

Peter’s LMC simulator does this nicely.  You can see the translation from letters to numbers — and which mailboxes the assembler decides to assign to the symbolic references.  Confusing?  This video will show you how it works.

Graphic Languages vs Text — Changing my Mind?

Maybe not exactly changing, but thinking things through. (Sounding like a politician?)  Whatever, here’s my latest thinking:

Background:
Graphical languages:  Think Scratch (desktops & laptops), Hopscotch or Scratch Jr. (tablets) — but LOTS of others.

Text Languages:  Think Python, Java, JavaScript, C, C++, C#; again, LOTS of others.

I’ll use Scratch and Python as examples, but what I say will apply to all of them.

Here’s what you usually hear.  “Little kids start with Scratch and then graduate to Python when they get older.”

A few brave souls will tell you that they have (or know of) a way to learn Python without going through the Scratch experience — implying that time is really wasted on Scratch.

(BTW, If you’ve never programmed, then some of my comments will seem either trivial/obvious or hard to understand.)

A big advantage of Scratch is that you move blocks and objects around on a screen — no typing.  True, for some, but many (most? all?) have cell phones and can type like demons.

A Scratch disadvantage is that the screen gets full of blocks quickly, so even moderately complex programs are hard to “see”.  Python, OTOH, reads like a book — and we’re all used to reading books.

I’ve been, pretty much, in this latter camp — probably because that’s how I learned and spent my programming life writing fairly complex programs.  The blocks do not help me visualize the program.  Scratch was born in 2003, when I was 67, after some 46 years of programming.  (Built-in bias.)

Do you remember when Microsoft Windows came into being?  1985, and it was very sluggish.  Folks knew how to make windows & mice work before that, but the hardware was too slow to make it practical.  Look at where we are now in just 30-some years.

I’m biased by my experience, but I’ll try and put some of it aside.  Almost all Python programs are linear (do things one-at-a-time).  Example: You have a pile of financial data and want a program to take those numbers, do some calculations (like add up revenues), and print or display the “answers”.  It’s called Procedural Programming, or better, the Input-Process-Output (IPO) paradigm.
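A sketch of that revenue example in Python, with invented numbers, shows the whole IPO shape in three lines:

```python
# Input-Process-Output in miniature: made-up revenue numbers in,
# one calculation, one answer out. Everything happens one step at
# a time, top to bottom -- the procedural/IPO style.

revenues = [1200.50, 875.00, 2310.25]    # Input (invented data)
total = sum(revenues)                    # Process
print(f"Total revenue: ${total:,.2f}")   # Output
```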

In the ’90s another way of programming dealt with objects.  It’s called Object Oriented Programming (nice discussion, here), or OOP.  If you Google OOP vs IPO you’ll get over 800 thousand hits.  Most of the programs written today favor the OOP paradigm.  Why?  Well, most folks feel that very complex programs are easier to write, have fewer bugs, and are easier to maintain.  As the politicians say, it’s complicated, but vote for it and they’ll then tell you why it’s better.

Scratch has objects on the screen (called sprites) that do things.  Each sprite can run independently and can send and receive messages from other sprites and act on those messages.  Very intuitive in a graphical environment — less so with Text.
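Here is a rough Python sketch (my own analogy, not how Scratch is actually implemented) of sprites as objects that react to broadcast messages, like Scratch’s “when I receive” blocks:

```python
# Sprites as objects: each sprite registers actions for messages,
# and the stage broadcasts a message to every sprite at once.

class Sprite:
    def __init__(self, name):
        self.name = name
        self.handlers = {}            # message -> action to run

    def when_i_receive(self, message, action):
        self.handlers[message] = action

class Stage:
    def __init__(self):
        self.sprites = []

    def broadcast(self, message):
        # Every sprite that has a handler for this message reacts.
        for sprite in self.sprites:
            if message in sprite.handlers:
                sprite.handlers[message]()

stage = Stage()
cat = Sprite("Cat")
cat.when_i_receive("start", lambda: print("Cat: meow!"))
stage.sprites.append(cat)
stage.broadcast("start")    # prints "Cat: meow!"
```

Each sprite decides for itself what a message means, which is exactly the intuition the blocks give you for free.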

Aside: the reason this can work well is that modern computers have multiple cores and can do different calculations at the same time.  Almost like having different physical computers working alongside each other.  Tricky stuff, for sure, but think of anything that we humans do.  Lots of simultaneous actions.  I’m typing this, listening to a CD (one of mine, actually), chewing on a protein bar.  The phone rings — I stop typing and answer it, but I do not stop the CD — or throw away the protein bar.  Hard to simulate that with a linear process.  (Possible if you break the different processes down into small enough pieces, but that’s very hard and tedious to do.)
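A small Python sketch of the idea, using the standard threading module: one thread keeps the “music” going while the main thread answers the “phone”.

```python
# Two things "at once": a background thread plays the music while
# the main thread handles the phone. The events list records what
# actually happened.
import threading
import time

events = []

def play_music():
    for _ in range(3):
        events.append("music")
        time.sleep(0.05)

music = threading.Thread(target=play_music)
music.start()                     # the music starts playing...
events.append("answer phone")     # ...while we answer the phone
music.join()                      # wait for the song to finish
print(events)
```

Whether Python truly runs these on separate cores is another story (the interpreter does its own locking), but the structure of the program is the point: neither activity has to stop for the other.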

Point is that we do all sorts of things simultaneously.  Interrupts change things; messages received cause changes (take out the garbage?).  Play a computer game.  You are constantly interrupted — and you are constantly interrupting the code that runs the game.  Each interruption usually causes an action change — from both you and the computer.

The other reason for using Scratch is that beginners, especially school kids, don’t program every day.  It’s much easier to remember how to move the labeled blocks around to make their program.  Finding the right block is easy.  If they are using Python, they will have to remember the syntax exactly — or, more realistically, look it up.  No problem if you are an everyday programmer; you have internalized all of the syntax.

For a beginner who knows that she needs a loop to execute some commands, but doesn’t remember the exact syntax, Scratch makes it easy.  Just look and find the loop block.
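For the record, here is the exact Python syntax she would have to remember (or look up), where Scratch just hands you a “repeat 10” block:

```python
# The repeat-10 loop a beginner must recall exactly in Python:
# the for keyword, range(), the colon, and the indentation.

greetings = []
for i in range(10):              # repeat 10 times
    greetings.append(f"Hello number {i + 1}")

print(greetings[0], "...", greetings[-1])
```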

So, you don’t have to start with Scratch, but if you are not at it every day, it makes life a lot easier.  Thinking of the sprites as objects and all that involves is very natural — a big advantage.


Useless Speculation? (a Definite Maybe!)

Just saw a neat PBS NOVA program called, “Why Trains Crash”.  If you missed it you can find it on their site, here.  As usual, there was a pitch at the end bemoaning how sad the USA situation is and that the ONLY way to right things is for massive government involvement.  That argument was bolstered by noting that Japan has many more train travelers and fewer accidents — and they spend much more.  However, they did mention that here, some 30 thousand folks are killed annually by autos and only one thousand by trains.

There’s a competing technology:  Self Driving Vehicles.  Here is an article describing the advantages (be sure and watch the short video).  In fact, there are some folks who predict that kids born today (and later) will never drive a car. Horrors, how can one become a well-adjusted adult without that DMV experience(s)?  (and those driver-ed classes)  Scary ?

I’ll hedge my bet a bit and say that “kids born after next year will never drive a car.”  How about you?  Place your bets.

So, if you were (are?) a congressperson (federal; this is such a big problem requiring LOTS of $), what would you do?  Vote for more government spending to improve the trains?  Do nothing, or cut back funding, assuming that the private sector will solve the problem with self-driving vehicles?  The latter involves violating the “Thou shalt not spend less” government commandment, so probably won’t happen.

As for most things, the market will sort it all out, even with regulators pouring sand and molasses in the economic gears.  The self-driving-vehicle revolution is accelerating.  It will be fun to watch it all happen.  Hey, a way to finally solve the “texting while driving” problem.


What’s Happening? It’s Been a Year

My last post was about a year ago.  Last week I noticed that about 300 folks became subscribers.  Seems strange — no posts and then a bunch of new subscribers.  Maybe someone at GoDaddy (hosts my site) got some wires crossed?  I’ll check it out.

But meanwhile, if this email is a surprise, you might want to check some of these posts out.  I just reread them all and fixed a few typos.  What I said a year ago still holds.  If anything, things are moving more quickly.  As a wise man once said, “Ride the horse in the direction it’s going!”

The latest laugher is that by 2040 (but what month?) AI (that’s Artificial Intelligence) will surpass humans.  Result: wipe out humanity!  Like, who needs folks?

What to do?  (I bet you guessed)

Create a government bureaucracy of VESPs (Very Elite Smart People) to create laws that will keep it from happening.  Just peel off a few greenbacks (or Bitcoins), send them in, and all will be well.  Not to worry, because VESPs have a learned and practiced skill of spending other folks’ money for TGG (The Greater Good).  Right.  How about we form a joint venture and make some T-shirts that say, “Elect Me to Save You — from the ROBOTS”?  Could be a winner.

Whatever, as my grandkids say.  The point is that AI and robotics are getting more sophisticated, can do more things, and will be able to greatly benefit us all — more jobs, greater choices, make prosthetics work better, etc.

Should the government control things?  Your call on that.  Not for me; I just think that any congressman, cabinet member, or anyone in government (either side of the aisle) has no clue about such a diverse and quickly changing market, or how to regulate it.

Never mind.  To play effectively in this world, you better know about robots, and particularly computer programming.  How much?  Tough call.  Depends.  Want to “play” an instrument or just “listen”?  Still good to know that the skinny black tube with some wood on the end is a clarinet — even if you can’t play it.

My assumption is that your kids haven’t chosen their life’s work yet (you have, right?), so how about they learn the basics correctly so they can better figure out how they want to interface with this rapidly changing world.

I had lunch last week with a smart college senior with a dual major in physics and economics.  His big problem is, “Which of all of these neat opportunities, do I take when I get out in June?”   Basically, I told him it didn’t matter as long as it looked interesting, because in a few years the world will change and he’ll be facing the same decision again — but with more experience and maturity.  And he’ll be, what, 25 years old!

Opportunities abound.  Glass half full, or half empty?  Be a doer or a taker?

Hope you stick around for my “restart”.

I’m planning on weekly or semi weekly posts, along with some freebie course materials.  If that’s too annoying for you, I’m sure that you can “unsubscribe” from my blog.


Opportunities galore, where are the folks??

Serendipity.  I was “surfing the net” looking for better ways to teach/learn programming.  “Learn to code, it’s harder than you think” (here) caught my eye.  Interesting.

Summary:

  1. Special aptitude needed. Only a small percentage of folks can do it well.  It’s a high aptitude task.
  2. Most existing programmers are self-taught.  A recent survey says 76%.
  3. Computer science majors looked down upon by other students.  “They’re geeks!”
  4. No clear, defined path to a programming career.  Software development is the only professional career where you can get free training.  Want to be a doctor?  Go to medical school for 7 years (if you have the $ and can somehow get in).
  5. Many (over half) of the graduates from computer science schools have insufficient programming skills to get hired.

This last point is echoed in many articles.  “Hey, I’ve had 4 years of French — but I can’t carry on a (French) conversation  — they talk so fast.”

So we have a two-sided problem.

1) Not enough students study computer science.  For the over half-million computing jobs open in the US, there are only about 40 thousand computer science graduates.  That’s a ratio of more than 12 to 1.  Maybe the “It’s easy” pitch sends a message that it’s not a serious career?  If most of the professionals are self-taught, why bother with a computer science degree?

2) Here’s the killer.  Well over half of those graduates are poor programmers.  Why?  It’s not PC to fail students?  Maybe they are being taught the “wrong stuff”.  Professionals not only have to code, but design, deal with end users, work in a team, and be able to adequately respond to their managers’ “panics”.

How much of all this is taught or even mentioned in schools?  Not much.

Unfortunately, many excellent academics have had little experience building software in the so-called, real world. Successful software products live a long time.  Others will be working on it for future fixes and enhancements.  Better know how to design and code so others can efficiently understand the logic and add to it.

For folks who have never been involved in software, it’s difficult to imagine the amazingly high complexity level.  There are thousands, sometimes millions, of lines of high-level code in a typical product.  More than one person can “hold in their head”.  Elaborate procedures are needed to ensure that errors don’t creep in while adding even a simple enhancement.  It’s not for the faint of heart.

As to 1), that will be solved by the marketplace.  The huge current emphasis on getting kids interested combined with folks learning about the economic opportunity will help. Folks always gravitate to opportunity.

Robotics adds an important concern.  If the product is a game or a simulation that just involves items on a monitor, that’s one thing; but if the software is moving a physical robot around, it’s another.  An error might damage valuable equipment or even kill someone.

Vastly different consequences.  The need for quality procedures and standards is much more important.

So why not learn proper habits from the start?


Robotics: What’s Happening (better: happened)

Check out Amazon’s Robots, 19 Most Important in 2015, and Merry Christmas.
There are videos at the bottom of the first two links.

Amazing, but I think that we’re now just at the “Model T” stage.  The advances that we’ll see will be stunning — and will impact our lives in unimaginable ways.

Remember the maze solving crow?  Could a current autonomous robot solve it?  I doubt it.  A way to go, but soon.

Be nice if our kids (and we) get properly prepared.  It all starts with programming a computer.  Better get it right.


Programs: For Me, For You, For All

I just read an interesting article, A Different Approach to Coding.  One of the authors, Mitchel Resnick (MIT Media Lab), is one of the inventors of Scratch, a widely used graphical programming language.  Their thesis is that coding is a new form of literacy.

BTW, Last December’s, Hour of Code, used Scratch and some other graphical approaches to get kids excited about programming.  If this is new to you, go to that site, spend a few minutes.  Just click the “Start Learning” button.  You’ll see immediately how it all works.  It’s fun.

Their programs are aimed at grade school kids.  That’s the main market. But, older kids and grown ups are not excluded and many get their first programming exposure via the graphical languages.

Besides the entertainment, all of this training falls in the “For Me” category.  By that I mean that these techniques are for programs that you write just for yourself.  Could be for fun or to solve a particular problem.  You know what data you have and what you want the program to do.

A good example of this activity is the spreadsheet that you build to sum up some numbers or to calculate something specific that you do over and over with a calculator.  (Can you still add up a column of numbers by hand?  Why is the answer different the second time through?)

You need very little documentation and in many cases will use the program for awhile and then “throw it away”.  (Well, it’s probably still there, on your computer somewhere, unless you needed the space and deleted it.)

All of the training that I’ve seen in schools, and in most of the on-line offerings to teach programming, results in learning how to build “For Me” programs.

For these programs, standards are not very important. Naming conventions, documentation, and proper structuring can be ignored because no one else will see that program and even you will probably scrap it after awhile. It’s probably not that large, either.

But, suppose you ask me to build you a program to input some sales numbers from a web database, calculate the average gross profit by product and print out a bar graph showing comparative results.

We’re now into the “For You” category, and several questions immediately come to mind, like, Where is the database? How many products? Where are the cost numbers needed to calculate the gross profit? What do you really want the output to look like?

That will take some time to sort out, and then I’ll figure out how to organize the program and then code it — understanding that you will be running it.  Key is “you”, not “me”.

Oops, what if you try to run it and the database is unavailable?  I see a phone call coming.  So I better put in some code that checks to see if the database is there and working properly.  If not, print you a message that says, “Sorry, database not available — try later”, or something like that.

Then, what if there is an entry in the database that’s supposed to be a number but is $%#, a “corrupted” number?

Lots of things like that come to mind, but because we are friends and you can call me, I just have to take care of the most obvious and annoying items.  You can call if something “weird” happens.
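A Python sketch of that defensive code, with fetch_sales() as a made-up stand-in for the real web database (it returns None when the database can’t be reached):

```python
# The "extra work": guard the happy path against an unavailable
# database and corrupted entries before doing any arithmetic.

def fetch_sales():
    # Pretend this came from the database; "$%#" is a corrupted entry.
    return ["120.50", "88.25", "$%#", "42.00"]

def average_sales(raw_rows):
    if raw_rows is None:                      # database unavailable
        print("Sorry, database not available -- try later.")
        return None
    good = []
    for value in raw_rows:
        try:
            good.append(float(value))         # corrupted? skip & note it
        except ValueError:
            print(f"Skipping corrupted entry: {value!r}")
    return sum(good) / len(good) if good else None

print(average_sales(fetch_sales()))
```

The two guards, the availability check and the try/except around the conversion, are exactly the kind of “extra work” I’m talking about; the actual calculation is one line.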

All of that “extra work” will take about three times the effort of just doing the calculations.  Yes, THREE!  Hard to believe, but if you are a programmer, you are nodding.  (Underestimate?)

Ok, it’s later and you are so happy with that program, you say, “It’s great and we should turn that into a product and get rich and famous.”  (Been there, done that — not the last part.)

Now we have the “For All” case.  We won’t know the users.  If something goes wrong, they will be on the help line and probably mad.  Some will be, that’s reality, but you want to cover as many of the errors as possible.  Errors come in two varieties: those inside our product and those outside.

The customer/user couldn’t care less who’s to blame; they just want it fixed — and yesterday, please!

By proper design, coding, and testing (lots!) I can minimize the internal errors in the product — but external? Very tricky.  Just have to anticipate as many as possible. You can only blame Bill Gates or Steve Jobs for so many.  (I’ve tried — it doesn’t get you far.)

Without going into detail we have another factor of THREE here.  So the original “For Me” program takes NINE times the time and money to turn it into a “For All” product.

If you have been through this software product process before, I bet you are thinking, “Only NINE times?  In my experience it’s more!”

Whatever that factor is, the initial design and programming is a small part (maybe 20%) of the lifetime resources needed.  Also, the product might need fixing or enhancing.

All of that is ignored in the myriad of available programming courses.

The 80% part is critical.  It’s there — the elephant (gorilla?) in the room.  Good methods, standards, etc. make that “doable” — meaning, without going bankrupt.

Why not learn those from the beginning? Hey, it even makes the 20% part “smaller”.