Free Markets, Free People

The end of gun control?

I ran across an article in Forbes by Mark Gibbs, a proponent of stricter gun control, in which he argues that, given a certain technology, gun control may in reality be dead.

That technology?  3D printers.  They’ve come a long way, and some of them are able to work in metals.  That, apparently, led to an experiment:

So, can you print a gun? Yep, you can and that’s exactly what somebody with the alias “HaveBlue” did.

To be accurate, HaveBlue didn’t print an entire gun, he printed a “receiver” for an AR-15 (better known as the military’s M16) at a cost of about $30 worth of materials.

The receiver is, in effect, the framework of a gun and holds the barrel and all of the other parts in place. It’s also the part of the gun that is technically, according to US law, the actual gun and carries the serial number.

When the weapon was assembled with the printed receiver HaveBlue reported he fired 200 rounds and it operated perfectly.

Whether or not this actually happened really isn’t the point.  At some point there is no doubt it will.  There are all sorts of other things to consider when building a gun receiver (none of which Gibbs goes into), but on a meta level what Gibbs is describing is much like what happened to the news industry when self-publishing (i.e. the birth of the new media) and the internet became realities.   The monopoly control of the flow of news enjoyed by the traditional media exploded into nothingness.  The traditional media has never been able to regain that control and, in fact, has seen it slip even more.

Do 3D printers present the same sort of evolution, as well as a threat to government control?  Given the obvious possibility, can government exert the same sort of control over the population that it exerts over gun manufacturers?  And these 3D printers work in ceramic too.  Certainly ceramic pistols aren’t unheard of.    Obviously these printers are going to continue to get better, bigger and able to work with more materials.

That brings us to Gibbs’ inevitable conclusion:

What’s particularly worrisome is that the capability to print metal and ceramic parts will appear in low end printers in the next few years making it feasible to print an entire gun and that will be when gun control becomes a totally different problem.

So what are government’s choices, given its desire to control the manufacture and possession of certain weapons?

Well, given the way it has been going for years, I’d say it isn’t about to give up control.  So?

Will there be legislation designed to limit freedom of printing? The old NRA bumper sticker “If guns are outlawed, only outlaws will have guns” will have to be changed to “If guns are outlawed, outlaws will have 3D printers.”

Something to think about.  I think we know the answer, but certainly an intriguing thought piece.  Registered printers?   Black market printers?  “Illegal printers” smuggled in to make cheap guns?

The possibilities boggle the mind.  But I pretty much agree with Gibbs – given the evolution of this technology, gun control, for all practical purposes, would appear to be on its way to dying.

~McQ

Twitter: @McQandO

Can you solve the debt crisis by creating more debt?

Most people intuitively know you can’t borrow your way out of debt, so it seems like a silly question on its face.  But the theory is that government spending creates a stimulative effect that gets the economy going and pays back the deficit spending in increased tax revenues.  $14 trillion of debt argues strongly that the second part of that equation has never worked.
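If you want to see why that second part has never worked, a little back-of-the-envelope arithmetic helps.  Here’s a rough sketch – the stimulus size, multiplier and tax rate are purely illustrative assumptions, not estimates of the actual economy:

```python
# Rough sketch: does deficit spending "pay for itself" in tax revenue?
# All figures are illustrative assumptions, not estimates of the real economy.
deficit_spending = 800e9   # hypothetical stimulus, in dollars
multiplier = 1.5           # assumed fiscal multiplier (an optimistic value)
avg_tax_rate = 0.18        # assumed share of the added GDP returned as revenue

added_gdp = deficit_spending * multiplier
recouped = added_gdp * avg_tax_rate

print(f"Added GDP:        ${added_gdp / 1e9:,.0f}B")                        # $1,200B
print(f"Recouped revenue: ${recouped / 1e9:,.0f}B")                         # $216B
print(f"Net new debt:     ${(deficit_spending - recouped) / 1e9:,.0f}B")    # $584B
# The spending "pays for itself" only if multiplier * tax rate reaches 1.0 --
# far above these (already generous) assumed values.
```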

The current administration and any number of economists still believe that’s the answer to the debt crisis now and argue that deficit spending will indeed get us out of the economic doldrums we’re in.  William Gross at PIMCO tells you why that’s not going to work:

Structural growth problems in developed economies cannot be solved by a magic penny or a magic trillion dollar bill, for that matter. If (1) globalization is precluding the hiring of domestic labor due to cheaper alternatives in developing countries, then rock-bottom yields can do little to change the minds of corporate decision makers. If (2) technological innovation is destroying retail book and record stores, as well as theaters and retail shopping centers nationwide due to online retailers, then what do low cap rates matter to Macy’s or Walmart in terms of future store expansion? If (3) U.S. and Euroland boomers are beginning to retire or at least plan more seriously for retirement, why will lower interest rates cause them to spend more? As a matter of fact, savers will have to save more just to replicate their expected retirement income from bank CDs or Treasuries that used to yield 5% and now offer something close to nothing.

My original question – “Can you solve a debt crisis by creating more debt?” – must continue to be answered in the negative, because that debt – low yielding as it is – is not creating growth. Instead, we are seeing: minimal job creation, historically low investment, consumption turning into savings and GDP growth at less than New Normal levels.

Not good news, but certainly the reality of the situation.  Deficit spending has been the panacea that has been attempted by government whenever there has been an economic downturn.  Some will argue it has been effective in the past and some will argue otherwise.   But if you read through the 3 points Gross makes, even if you are a believer in deficit spending in times of economic downturn, you have to realize that there are other reasons – important reasons – that argue such intervention will be both expensive and basically useless.
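Gross’s third point is worth making concrete, because it’s just arithmetic: the principal a saver needs to throw off a given income scales inversely with the yield.  A quick sketch, using made-up round numbers for the income target and the yields:

```python
# How much principal does a saver need to replace a target retirement income?
target_income = 50_000                      # hypothetical annual income goal

for yield_rate in (0.05, 0.02, 0.005):      # 5% "old normal" vs. today's lower yields
    principal = target_income / yield_rate
    print(f"At {yield_rate:.1%} yield: ${principal:,.0f} in savings needed")
# 5.0% -> $1,000,000; 2.0% -> $2,500,000; 0.5% -> $10,000,000.
# Lower rates force savers to hold more principal for the same income --
# which means more saving, not more spending.
```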

We are in the middle of a global economy resetting itself.  Technology is one of the major drivers, and its expansion is tearing apart traditional institutions in favor of new ones that unfortunately don’t depend as heavily on workers.

Much of the public assumes we’ll return to the Old Normal.  But one has to wonder, as Gross points out, whether we’re not going to stay at the New Normal for quite some time as economies adjust.   And while it will be a short term negative, the Boomer retirements will actually end up being a good thing in the upcoming decades as there will be fewer workers competing for fewer jobs.

But what should be clear to all is that, without serious adjustments and changes, the welfare state, as we know it today, is over.  Economies can’t support it anymore.   That’s what you see going on in Europe today – its death throes.   And it isn’t a pretty picture.

So?  So increased government spending isn’t the answer.  And the answer to Gross’s question, as he says, is “no”. 

The next question: how do we get that across to the administration (and party), which seems to remain convinced that spending like a drunken sailor on shore leave in Hong Kong is the key to turning the economy around and to electoral salvation?

~McQ

Twitter: @McQandO

Moving technology from making things possible to making them easy

I’m coincidentally the same age as Steve Jobs and Bill Gates. I’ve seen and worked in the industry they created – what we first called "micro-computers" and later "personal computers" or PCs.

Even that term is falling out of favor. "Laptop" is probably heard more often now, with "tablet" and "slate" moving in.

I’m wondering, though, if "slate" will actually stick. Just as "kleenex" is the word most of us use for a small tissue to wipe your nose (no matter how Kimberly-Clark feels about it), I wonder if we’ll someday be talking about "ipads" from Amazon and Samsung. That would merely be continuing the trend where "ipod" is becoming the generic term for an MP3 player.

This is one example of the power of Steve Jobs to set the agenda in the last ten years. There are plenty more.

The changing signs on Music Row in Nashville are another testament to his ability to turn an existing order upside down. The iPod changed the music industry beyond recognition, and here in Nashville we had a front-row seat to watch the changes.

The area of most interest to me, though, is in software. I’ve focused more on user interface design over the years than any other area. I’ve watched Apple drive a trend that is powerful and desirable in our industry: moving from just making something possible with technology to making it easy.

For decades, it was enough for a software program to make something possible that was not possible before. DOS-based software was never particularly easy to use. The underlying technology to make it easy just wasn’t there.

Jobs and Wozniak pioneered that era, but Bill Gates ruled it. He reduced IBM to irrelevance, along with Novell, Lotus, and WordPerfect, all major league software companies at one time.

To some extent, Bill understood the importance of making things easy; Excel was about ten times easier to use than Lotus 1-2-3. But he never really innovated much in making things easy. His forte was seeing good ideas produced by others and then copying those ideas and making products based on them affordable and practical. Windows was never the equal of the Mac until (arguably) Windows 7, but it ran on cheaper machines and Bill made it friendly to businesses, which were the biggest buyers of PCs until somewhere in the 1990s.

Steve Jobs and his crew were Bill’s best idea source. I sometimes thought that they served as the unofficial research arm of Microsoft for user interface design throughout the eighties and nineties. Apple sputtered through that period, producing hits (iMac) and misses (Newton). At one point, Bill Gates even stepped in with a capital infusion that saved Apple from likely irrelevance or even bankruptcy. I suppose he didn’t want to see his free research lab disappear.

During that era, Steve Jobs kept pushing the boundaries. The very first Mac was a pain to use, because it was too slow to do what he imagined, and had a screen that we would laugh at today. But it made some new things possible, such as real graphic editing. Though a PC was my main machine in the mid-1980s, I would put up with the Mac’s flaws to do my graphics work. The salesmen at our company often said that our diagrams of the system we were proposing clinched the sale.

I believe Jobs had a vision during that period of what personal technology could be like, but the nuts and bolts were not quite there. Nevertheless, he always insisted on "user first" thinking.

Jobs understood something that is still misunderstood by almost all companies in technology. You can’t innovate by asking your users to tell you what to do.

The typical technology company convenes focus groups and does market research, and then says "Ah, what buyers want is X, Y, and Z. OK, you lab guys, go create it for the lowest possible cost."

Steve Jobs understood that consumers and users of technology don’t know how to design technology products any more than movie goers know how to write screenplays. To create innovative and delightful user experiences, it is necessary to get inside the mind of the user and understand them so well that you know what they will like even before they do.

This is hard. It’s so hard that only two companies in my lifetime have been any good at it at all: Apple and Sony. And these companies have dramatically different batting averages, with Apple up in Ted Williams territory while Sony languishes around the Mendoza line.

Finally, about ten years ago, the underlying technology started matching up with Jobs’ vision. The result was the iPod.

There were plenty of MP3 players that pre-dated the iPod. I had one, from Creative. It had about enough storage for three albums, and required me to organize files and folders on it to store my music.

Steve Jobs saw the small, low power hard disks coming on line and realized they could be the foundation of a new, reimagined device. First, it would store hundreds of albums or thousands of songs – a typical person’s entire music collection. It would use software designed earlier to manage music – iTunes.
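The storage arithmetic behind that leap is easy to sketch.  Assuming the original iPod’s 5 GB disk and a ballpark figure for a compressed song (both rough numbers, not spec-sheet claims):

```python
# Back-of-the-envelope: what a small hard disk meant for a music player.
drive_capacity_mb = 5 * 1024   # original iPod: a 5 GB disk (rough figure)
avg_song_mb = 4                # assumed size of a typical compressed song
songs_per_album = 12           # assumed average album length

songs = drive_capacity_mb // avg_song_mb
albums = songs // songs_per_album
print(f"~{songs} songs, roughly {albums} albums")   # ~1280 songs, ~100+ albums
# versus the flash-based players of the day, which held a few dozen songs.
```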

The big departure was the approach to user experience. The iPod was so simple to use that someone could pick it up and figure it out in about two minutes.

This was done by purposely leaving out features that were arguably useful. While the other MP3 makers were designing and marketing on checklists of features, the iPod stripped things down to the basics. And kicked the others to the curb.

Jobs realized before others that it was time to stop working on "possible" and start emphasizing "easy". When technology is new and rapidly evolving, something new is possible with each passing year, and giving buyers new features is enough to sell products. But when technology reaches a certain point, and the feature lists get long enough, all products have the essential features. The differentiation then becomes based on something very simple: what people like.

This is particularly true as technology starts appealing to a broad market. If you try to satisfy everyone in that market by including every feature anyone across the spectrum wants, you’ll end up with an unusable mess.

At some point in the evolution of technology for a given space, people just assume that the features they really need will be in all the devices they see. They start choosing based on emotion. That is, they seek what feels elegant and fluid to them, something they really want to be a part of their daily life.

This is where genuine design, based on universal design principles that go back decades or centuries, starts adding value. For example, Hick’s Law says that the time required to choose an option goes up as the number of options increases. Simply put, users get frustrated trying to find the feature they want from a long list of features in a menu, or trying to find the button they want on a remote control that has fifty-eleven buttons.
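Hick’s Law even has a simple formula: decision time grows with the logarithm of the number of equally likely choices, roughly T = b · log2(n + 1).  A small sketch of what that looks like (the constant b here is just an illustrative value, not a measured one):

```python
import math

def hicks_law_time(n_choices, b=0.2):
    """Approximate decision time (seconds) for n equally likely options.
    b is an illustrative per-bit constant, not a measured value."""
    return b * math.log2(n_choices + 1)

for n in (4, 10, 50):
    print(f"{n:>2} options -> ~{hicks_law_time(n):.2f}s to decide")
# Time grows logarithmically -- but only when the options are familiar and
# well organized; a menu of fifty unfamiliar items degrades toward a linear
# scan, which is why long feature lists and crowded remotes feel so painful.
```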

There is an entire body of knowledge in this space, and the first major computer/software company to emphasize designers who knew and understood this body was Apple. The culture at Apple values people who know how to get inside the mind of a user and then create a new way of interacting with technology that the user will love.

Jobs created and drove that culture. He went from turning the music business upside down with the iPod to turning the phone industry upside down with the iPhone, and now Apple is remaking their original territory, the personal computer, with the iPad.

I’ve discussed before in the comments here that I don’t like the iPad. It’s slow and limited for my purposes, many of the web sites I use are not compatible with it, and I don’t like iTunes.

But it’s not designed for me. That’s a key lesson that designers grow to appreciate. Each design has a target audience, which must not be too broad. The true test of a good designer is whether they can design something for someone who is not like them. 

I put my iPad in the hands of my 76 year old mother, and she immediately took to it. I showed her a few basic touch gestures, and she could immediately do the only things she uses a computer for – browsing and email. For her, it was easy, and as a veteran of the made-to-do-anything-and-everything Windows (I got her a computer for email and such six years ago), she really appreciated that.

The culture created by Jobs can do things that Microsoft, for all its money and brains, is not very good at. Microsoft people are smart. I work with many of them, so I’ve seen it firsthand. But almost all of them have a tendency that is all too common in the human race. They can only see the world through their own eyes, and are not very good at seeing it through the eyes of someone with a radically different background or different abilities.

When Microsoft teams start designing a new product or version, at least in most of the cases I’ve been involved in, the process starts with a list of proposed features. In other words, their process starts with what they want to make possible for the user.

Unlike Apple, the culture at Microsoft places little or no value on making things easy. This isn’t surprising, because Microsoft’s success over a span of decades has not been dependent on innovation in making things easy. It’s been in making things possible and affordable. They copied the "make things easy" part from someone else, usually Apple.

But even Microsoft has seen the direction for the industry laid out by Jobs and Apple, and realized that things have sped up. Copying isn’t good enough any more. Jobs perfected the process of laying waste to entire segments with an innovative new entry, and as the iPhone showed, it can happen in a single year.

Those at Microsoft are starting down the path of worrying more about user experience. They may not like it much, but they realize it’s now a matter of necessity. 

First, they created the Xbox – an entirely new product in a separate division that successfully challenged established players in a world where user experience trumps everything else. Then, shamed by the abysmal Windows Mobile products they had produced in the phone space, they created a pretty decent product there in the Windows Phone.

Their steps are halting and tentative, but at least they are toddling down that path now. I hope they learn how to walk and run on that path, but given the effort it will take to turn their culture around, that will take a while.
 
I don’t know that they would have ever gone down that route if Jobs and Apple had not pushed them down it. I’ve chafed for most of my career at the apathy and ignorance in the Microsoft community around user experience. I’ve always believed that our systems and devices exist for users, not for our own aggrandizement. As such, we owe them the best experience we can give them.

I was never a major Apple customer. Apple was never a cost-effective choice for the business and data oriented software I’ve created.

But that doesn’t mean that I don’t appreciate what Steve Jobs did for our industry. I absolutely do. I wish he could have been around for another decade or two, continuing to show the world that "possible" isn’t good enough, and push the rest of the industry into respecting our users and making things easy.


There are certainly more comprehensive tributes, but this is my favorite so far. From Steven Horwitz:

Unlike many, I am not an Apple-phile. I honestly don’t get the emotional relationship people have with their products. HOWEVER… there is absolutely no doubt that Steve Jobs is a symbol of all that is right with markets and capitalism. This is a man who became very, very rich by making many people’s lives (including my own) very much better. He was a master at creating value and persuading people that they wanted things they didn’t know they wanted. He should be part of the pantheon of human heroes.

Unlike the political and military heroes of war we too often celebrate, Jobs is a hero of peace. He made his money through persuasion not at the point of a gun, and through mutual benefit not oppression and exploitation. Those of us who really desire a peaceful society should not celebrate those who were victorious in war, but those who created value through peaceful, voluntary, mutually beneficial exchange – exchanges that happen billions of times every single day. And we should do it no matter whether what was exchanged was electronic bits of magic, food for us to eat, or financial instruments that improve the movement of capital. They all create value and improve our lives, and all of their benefits are deserved.

Thanks for everything Steve and thanks for making the world a better place one peaceful, cooperative exchange at a time.

For the software devs among our readers, a report from //build/

Microsoft’s //build/ conference is on, where they are rolling out plans for a pretty dramatic shift in Windows for the next generation.

I’m in sunny Anaheim at the conference, with no time to pen a long post. If you’ve got ten minutes to waste listening to me ramble, and you care about the Microsoft side of the tech industry, you can watch this video which was posted a couple of hours ago. Actually, it might be better to watch some other videos in the series that feature Microsoft executives with a lot more interesting and detailed things to say, but, hey, if you make fun of them in the comments here, they’ll never see it. Whereas you can point out that the camera angle makes me look like I have some kind of weird arthritis, and I just have to take it.

If you don’t care about software development, or do care but are apathetic or hostile to Microsoft, my apologies. Please return to our usual program of economic and political doom.

The shape of things to come

Michael Wade sent me an email a few days ago asking me what purpose I thought banks served any more. That email led to a phone conversation, and that conversation led to this post. Because that simple question opened up a whole area of investigation about not just banking, but a whole universe of institutions that may be near the end of their purpose.

The modern era has been an age of institutions. Banks, unions, governments, corporations—a whole panoply of organizations whose sole purpose was to provide a central clearinghouse for goods and services, and the regulatory rules and legal framework under which they operated. But we are now seeing glimpses of a future in which institutions simply have no purpose, or, at the very least, will serve a different purpose than they do now. The era of institutions is passing, and is being replaced by the era of…something else. I think—I hope—it will be the era of the individual.

Let’s take the example of banks, first. Currently, banks take deposits from their customers, then loan those deposits out—less a reserve requirement—for mortgages, revolving credit, business loans, etc. They also offer their customers the convenience of access to their money on a moment’s notice, almost anywhere in the civilized world. 
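To make the mechanics concrete, here’s a stylized sketch of that deposit-and-lend arithmetic – the reserve ratio is an assumed round number, not the actual regulatory figure:

```python
# Stylized fractional-reserve arithmetic: how deposits turn into loans.
deposits = 1_000_000        # hypothetical customer deposits on hand
reserve_ratio = 0.10        # assumed reserve requirement (a round number)

held_back = deposits * reserve_ratio
loanable = deposits - held_back
print(f"Deposits:          ${deposits:,.0f}")
print(f"Held in reserve:   ${held_back:,.0f}")
print(f"Available to lend: ${loanable:,.0f}")
# If everyone carries their own money instead of depositing it,
# this pool of loanable funds walks out the door with them.
```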

Now, imagine a world where your money is stored on a personal biometric device. So, you no longer need an institution to store your money.  You can carry it with you—perhaps implanted in you—everywhere you go. Your entire stock of cash and savings is now truly yours, and in your personal possession at all times. So what happens to banks? Without depositors, there are no longer any deposits to loan out for credit cards or home purchases. More importantly, what happens to credit? Perhaps banks will have to change from depository institutions to investor-funded lenders. Or be replaced by them, as there are already web sites where potential creditors and debtors can engage in micro-lending.

We are on the cusp of really transformative technological change, and if you want to see what the implications are for institutions, you need look no further than the music industry, where the RIAA is in a fierce rear-guard battle to maintain their viability. The entire music industry is being destroyed, as an institution, by the new digital technologies that were created just a decade ago. It may be a shock for some of you younger readers, but there was, at one time in the recent past, a world in which there were record stores in every shopping center and mall.

It used to be that the recording industry controlled every aspect of commercial music.  They would underwrite the recording costs, create the playable media and packaging, then pay for the distribution to music stores. If you wanted a piece of that pie, and to hit it big in the music world, you had to scrape up enough money to make a demo tape, send it in to Sony, BMG, RSO, etc., and hope that some executive was impressed enough to sign you to a contract to make your first album.

The world doesn’t work that way any more.  For less than $1000, you can turn your dingy studio apartment into a multi-track recording studio. You can get a web site, upload your MP3 files onto it, and sell them online. You don’t need a record company, a distribution channel, or marketing money. This is killing the record industry. The RIAA is actually trying to extort royalty money from bar owners who have live bands play, on the theory that they should get a piece of the bar’s profits from the music performance.  Good luck with that.

Digital publishing is starting to do the same thing to the publishing industry, as Amazon is making it possible for anyone to publish their book. Yes, a lot of less than stellar talents are publishing for the Kindle now, but some mainstream writers are now moving over to the Kindle platform. Publishing, as an institution, is in trouble.

Technology is now empowering individuals in ways that were undreamed of 20 years ago, and the pace of that change, and the vistas it’s opening up for individual empowerment, are increasing every day.

Obviously, institutions, including governments, are going to become increasingly leery of this trend. After all, it is not in the best interests of institutions to allow individuals to be empowered. So there will be some sort of backlash at some point. Hopefully, that backlash will be as ineffective as the RIAA’s backlash against digital music has been. But some institutions have their own police and armies, and they have the potential to resist more strongly.

Of course, since we are now in the middle of what appears to be a huge test of government’s ability to manage the economy and currency—and government is not doing a very good job of demonstrating competence—maybe even that potential problem can be minimized.

We can only hope.

London rioting–are we seeing the death throes of the welfare state?

Buried deep in the New York Times story about the ongoing riots in London, the inability of the police to contain them and the fact that they’ve now spread to other cities is this paragraph:

For a society already under severe economic strain, the rioting raised new questions about the political sustainability of the Cameron government’s spending cuts, particularly the deep cutbacks in social programs. These have hit the country’s poor especially hard, including large numbers of the minority youths who have been at the forefront of the unrest.

The underlying cause of the riots had to do with the shooting, by police, of a popular activist in London.  The spread, however, is presumably now because of the “spending cuts” the Cameron government has made in an effort to address its very serious deficit problem.   This comes on the heels of the same sort of unrest and rioting in Greece when social programs were cut.

The paragraph is intriguing because of the way it approaches the problem.   It doesn’t stress the debt or deficit the UK has, or the fact that the level of spending the UK is committed to in order to fund the social programs is unsustainable; it instead addresses the “political sustainability” of such cuts.

That’s a very telling point.   Substitute “political will” for “political sustainability” and you get the picture.  And frankly, that’s what it boils down to everywhere.   Do the politicians in charge actually have the political will to do what must be done to right the financial ship of state?

What has been built by the welfare states everywhere is crumbling.  There are large irreparable cracks in their foundations.  All are showing signs of unsustainability, and that is leading to internal instability.  The largess taxed from the producers and borrowed on the recipients’ behalf isn’t going to be there much longer.

That’s the problem.  Even the rioters know that the gravy train, in relative terms, is pretty much over.  Reality, not politicians, has said so.  In fact the politicians mostly have no choice – they either have the means to continue as they have in the past or they don’t.   And the more severely indebted welfare states are hitting that wall.

Add this to the mix though and you see how very horrific this is for the UK:

Beyond such social challenges is the crisis enveloping London’s Metropolitan Police. Even before the outbreak of violence, the police have been deeply demoralized by the government’s plan to cut about 9,000 of about 35,000 officers and by allegations that it badly mishandled protests against the government’s austerity program last winter and failed to properly investigate the phone-hacking scandal that has dominated the headlines here for much of the summer. The force now faces widespread allegations that it failed to act quickly and forcefully enough to quell the rioting at its outset over the weekend.

And of course, citizens there are left not only to fend for themselves in many cases, but have been disarmed by government to boot.

As for the poor “disadvantaged youth” at the center of the rioting?  Well it seems they may not be quite as poor or disadvantaged as one would think:

Despite a build-up in the number of riot police officers, many of them rushed to London from areas around the country, gangs of hooded young people appeared to be outmaneuvering the police for the third successive night. Communicating via BlackBerry instant-message technology that the police have struggled to monitor, as well as by social networking sites like Facebook and Twitter, they repeatedly signaled fresh target areas to those caught up in the mayhem.

They coupled their grasp of digital technology with the ability to race through London’s clogged traffic on bicycles and mopeds, creating what amounted to flying squads that switched from one scene to another in the London districts of Hackney, Lewisham, Clapham, Peckham, Croydon, Woolwich and Enfield, among others — and even, late on Monday night, at least minor outbreaks in the mainly upscale neighborhood of Notting Hill and parts of Camden.

They’ve used technology to organize flash mobs of looters.   It’s anarchy and the police seemingly aren’t up to the job of stopping it.

The BBC and other British news organizations reported Tuesday that the police may be permitted to use rubber bullets for the first time as part of the government’s strengthened response to any resumption of the mayhem. David Lammy, Britain’s intellectual-property minister, also called for a suspension of Blackberry’s encrypted instant message service. Many rioters, exploiting that service, had been able to organize mobs and outmaneuver the police, who were ill-equipped to monitor it.

Rubber bullets, of course, only have an effect if police are where the rioters are.   And apparently, that’s not something they’ve been particularly successful in doing here lately.

Finally, harkening back to the fact that the UK has a serious debt and deficit problem and must cut spending, one has to wonder why it is spending money on things like this:

On Tuesday, the violence seemed to be having a ripple effect beyond its immediate focal points: news reports spoke of a dramatic upsurge in household burglaries; sports authorities said at least two major soccer matches in London — including an international fixture between England and the Netherlands — had been postponed because the police could not spare officers to guarantee crowd safety. The postponements offered a dramatic reminder of the pressures on Mr. Cameron and his colleagues to guarantee a peaceful environment for the 2012 Summer Olympic Games.

That $15 billion extravaganza will have its centerpiece in a sprawling vista of new stadiums and an athletes’ village that lie only miles from the neighborhoods where much of the violence in the last three days has taken place.

Bread and circuses?  The UK is laying off policemen and cutting defense spending, but has $15 bil to throw at the 2012 Summer Olympic Games?  One has to wonder about priorities.

All in all, a very volatile situation which could, given the methods being used by the criminals, get worse.  In the meantime expect the liberals on both sides of the Atlantic to denounce the cutbacks in social spending and demand the rioting “youths” be placated.  Political will is a scarce commodity in this world.   It may indeed end up that the “political sustainability” of the cuts falls before the desire of politicians to maintain power.   Of course that won’t change the fact that the unsustainable spending bill will come due whether they or the rioters like it or not.   But perhaps, just perhaps, they can kick the can down the road just enough for them to escape the wrath and blame that will come when that can can’t be kicked anywhere any longer.

~McQ

Twitter: @McQandO

And now for something completely different–PCs and embryos

A couple of topics of interest.  Reuters carries a story entitled “Aging PC giants see writing on the wall”.   Seems funny to call the personal computer industry an “aging” industry, but I think the thrust of the article is right – at least regarding the “desktop” computer:

Silicon Valley’s old guard is waking up to the fact that the era of consumer PC may be in its twilight, accelerating the need to invest and adapt to rapidly changing tastes.

This week’s earnings from the giants of technology had one thing in common: they underscored yet again how consumers are increasingly shunning desktop PCs and going mobile.

Intel, which had argued that pessimistic expectations about the market were out of whack, reduced its 2011 PC forecast. Microsoft Windows sales, that reliable indicator of PC market strength, fell short of expectations for the third straight quarter.

And Apple Inc, which single-handedly showed with its iPad that many consumers are more than happy with an unladen, light and mobile computer, obliterated all estimates by selling a whopping 9 million tablets.

"The desktop, at least for consumers, probably doesn’t have a great future, and the iPad and similar tablets can deliver a lot of the functionality of a laptop," said Tim Ghriskey, chief investment officer of Solaris Asset Management.

Using only my own experience as a guide, I rarely use my desktop computer anymore.  In fact, I think of it as a legacy computer.   Just about everything I do now is on a laptop.   As for the iPad, I use it extensively as well, but not primarily.  In the type of work I do, to include blogging, it is more of a supplementary tool.  But I can see that could easily change.   Given the paucity of good apps for blogging that presently exist – especially for WordPress – I’m on the laptop instead.   However, should that change, the iPad could easily become dominant (especially with the bluetooth keyboard).

On the business side of things, I can see the desktop being around for a while longer.   However, again, my experience working for a company in the field had me only operating off of laptops.   I could see beefed up tablets taking that bit of the market – i.e. that part of the business market that relies on laptops.  So yeah, I’d say the “aging giants” are right.  The desktop is likely headed for the museum.  Laptops probably have a longer (leaner and lighter) future.  At some point, I imagine the tablet and laptop will merge and dominate.

Topic two, from the UK:

Scientists have created more than 150 human-animal hybrid embryos in British laboratories.

The hybrids have been produced secretively over the past three years by researchers looking into possible cures for a wide range of diseases.

The revelation comes just a day after a committee of scientists warned of a nightmare ‘Planet of the Apes’ scenario in which work on human-animal creations goes too far.

This is a plot right out of a bad mad scientist SciFi movie.  The question of course is “why”?

That question was asked by this committee of scientists and the answer was apparently less than satisfying:

Last night he said: ‘I argued in Parliament against the creation of human-animal hybrids as a matter of principle. None of the scientists who appeared before us could give us any justification in terms of treatment.

‘Ethically it can never be justifiable – it discredits us as a country. It is dabbling in the grotesque.

‘At every stage the justification from scientists has been: if only you allow us to do this, we will find cures for every illness known to mankind. This is emotional blackmail.

But:

‘Of the 80 treatments and cures which have come about from stem cells, all have come from adult stem cells – not embryonic ones.
‘On moral and ethical grounds this fails; and on scientific and medical ones too.’

And:

All have now stopped creating hybrid embryos due to a lack of funding, but scientists believe that there will be more such work in the future.

To recap – they promise wondrous cures in an area where none have been produced, and the marketplace has obviously turned its nose up at the effort of producing embryonic stem cells: funding has dried up, one suspects to be placed in the area where there is promise, and that’s adult stem cells.   So there’s no apparent market or reason to make embryonic hybrids.

Much discussion in the article about the “ethics” of the effort.  Is it indeed “dabbling in the grotesque”?  Is it “never … justifiable?” 

Your thoughts.

~McQ

Twitter: @McQandO

Sorry about the lack of posting

But I’m getting used to a new toy – and I usually do that through total immersion. I’ve gotten my first Apple product, a gift for my recent birthday from my wife. She is a geek at heart, and what this means is she wants one too.

Anyway, the product is an iPad. I bought a wireless keyboard to go with it (I and virtual keyboards don’t get along – even ones with pretty big keys like the iPad’s).

To say I’m in love would be an understatement. What a marvelous product. And the more I learn the more I recognize its power.

Look, I’ve been around computers since the ’80s. I remember buying my first computer with a hard drive – 10 whole megabytes – and wondering what I’d do with all that storage space.

I’m sure I’ll find things I don’t like or that aren’t particularly well done, but I haven’t found them yet. This is an elegant, well thought out product that has almost unlimited potential in all sorts of areas. As for applications – well, you could probably spend your waking hours poring through all the apps for this thing and still have years to go to get through them all.

Anyway, I hope to be back to semi-normal by tomorrow. This is being done on the WP app for the iPad (I got the 16GB WiFi model).

The iPad is one of those products that, I feel, within a few months will have me wondering how I ever lived without it. Until tomorrow, you’ll have to excuse me while I play and enjoy. See you then.

~McQ

Twitter: @McQandO