Just made a business-oriented posting over on Digital Katana (my personal business site). If you have looked at the Comcastic “Listen To A Comcast Rep Torture Customers Trying To Cancel” piece over at Huffington Post (http://www.huffingtonpost.com/2014/07/14/the-comcast-call-from-hell_n_5586476.html), this is a commentary on Rob Enderle’s analysis at ComputerWorld.
Category Archives: Tech
Is it wrong that I don’t get a good feeling when Google buys?
Google just bought Nest. Great for Nest and the team there. But now I’m wary of yet another product.
Let me be clear. Google was, and really still is, the de facto search engine with great results. But the corporation has a knack for both pissing me off and unsettling me every time they do anything, or whenever that walking pit of arrogance Eric Schmidt opens his yap.
They bought Waze. All fine really, if it’s independent and my location data isn’t shared with Google. It is a brilliant service, and developed by some really talented people. But I don’t want my information getting connected with a bunch of marketing or the US government or whomever else Schmidt and Google decide they want to sell it to.
I still use Waze, but not as often. It just doesn’t have that feeling of goodness and confidence now that Google is behind it. They always seem to feel entitled to any data and anything they can discover, and that they can sell or market it however they see fit. Europe has given them a few slaps, and the US and Canada the odd one as well over their wi-fi trolling via their StreetView cars. But they just seem to keep doing these sorts of trust-betraying things.
So now the thermostat and smoke detectors are in the Googleplex. The system connects with a central server to track data and remotely control your home systems. It’s really good, convenient, energy efficient and part of where smart homes should have been going a long time ago.
But I just don’t trust Google anymore with data. After Schmidt infamously stated “Just change your name at 18” for people who wanted to leave their Internet history behind (like you couldn’t also track that bit of info down if it happened) and was completely insensitive to the entire idea of personal privacy, added to the book digitization, added to the echo-chamber we-will-show-you-what-WE-think-you-want-to-see-too-bad-if-you-want-the-old-pure-algorithm modification of things, and the unknown hundreds of other behind-the-scenes shapings of our digital experience, I’m just getting fed up with this company. They need to earn back the trust of their customers. The issue is we aren’t their customers. We are their product. Many writers far more talented and diligent than I have expounded on this at length. I was a customer of Nest, not a product. I was somewhat of a product of Waze, as it was free and ad-supported, but it wasn’t wired in with the rest of Google initially.
Now I’m a product for all these things I purchased. Somehow it seems wrong. I want to be able to unplug from the Nest site and host the interconnect and data somewhere else now. I don’t like the feeling at all. I don’t like the implied control being in Google’s hands, in the hands of a company that regularly EOLs products after they destroy the ecosystem (Google Reader being the most recent), or after they are relied on by people or businesses.
So Google, I’m asking. Why should I trust you with all this information of MINE?
Hashtags Considered #Harmful
Hashtags Considered #Harmful: “Hashtags Considered #Harmful”
I find this kind of like the Google justification or Facebook justification for tailored search bubbles: you get the “most relevant and popular” results due to the way the cloud operator (Twitter in this case) chose to implement the hashtag search. That doesn’t mean it’s right or wrong, but you may choose to disagree with the way your results are being “curated”.
I’ve used hashtags for search and publication in some cases, and they are most useful where frequency is not as high as an outlier like the Super Bowl, where the community is scattered and not in the “elite” of the Twitterverse, and, most importantly, where the context of the tweet cannot be inferred efficiently without too much of the relevant content being lost in the 127 characters.
Twitter often devolves into popular news and culture because it requires a shared context to impart enough information in 127 characters to be of interest. I also hypothesize this is why app.net has so many rather direct and long-running conversations between pairs and small groups of people: the built-up context, while valuable thanks to the newer service’s conversational threading, itself requires effort and lacks the “cleanliness” of the baseline micro-broadcast that the author at the New York Times seems to disdain.
The point isn’t totally invalid, although I feel the presentation, as with so much modern journalism, ironically throws away a lot of context.
(Via Daring Fireball.)
What happens when you don’t get it….
Well, the blogroll pulled up an interesting article on the Zune and the iPod. The author, Mike Elgan, writes at some length about how Zune might take on the iPod by becoming the anti-iPod. His article, for your reference and enjoyment, is “Zune: So you want to be an iPod killer”.
I would venture Mr. Elgan has a few interesting ideas, but pointing to sites like iPod Hacks as the basis that iPod users want open and complex devices with great extensibility and customization is a bit of a wanton leap past the region of statistical extrapolation. iPod Hacks is a cool site, and it’s in the best vein of the hacker ethic of “What can I do with this device?”, but it’s not like the millions of iPod users are in any way represented by the (proportionally) small base of users that frequent and utilize iPod Hacks information and software. I’ve kept an eye on iPod Hacks myself, as from my hacker perspective it’s cool what people can do with the device. But would I ratchet my iPod into that and lose the seamless, best-in-class music player functionality and integration I enjoy? No chance.
Mr. Elgan makes a number of very fundamental, geek-mindset mistakes in his article. He accurately describes Microsoft’s company strengths, and goes extensively into the abysmal customer experience that the Zune is when removed from the packaging. Then he loses his grip on consumer products and launches into what he wants for a music player.
Dismissing that user experience out of hand is simply foolish, especially in a market entry. Every one of the customers that goes through that pain is more likely to diss the Zune and go iPod unless they are either anti-Apple or blindly pro-Microsoft. They were trying to buy a music player, and they got a box of pain. Waving the hand of providence and MSFT-get-v3.0-right is just naive. That experience is why Apple is rolling downhill like a consumer-rampaging avalanche of revenue in the music player business. They built a device that absolutely excels at being a music player. That’s it. Now it does small video as well, but that’s not how it got to dominate the market.
Is Apple paying attention to it? You bet. And they will compete with it fiercely, and it will likely benefit the marketplace as various pressures and pricing come to bear based on the acceptance of the varying offers. I don’t think we’re going to see iPod price drops thanks to Zune based on the current offering though.
This touches on the bigger fallacy that Mr. Elgan puts forward, one that simply isn’t true except among geeks. I quote:
History shows that the functionality of stand-alone gadgets always gets folded into multipurpose devices. Apple’s instinct to maximize elegance at the expense of extensibility made them No. 1 in the media player market, but the future belongs to customizable, multifunction players.
I’m afraid I can’t come up with anything that actually supports this “historic” assertion. Smart phones are eclectic and in no way make up the majority of the devices people have. I even know some tech-heads who are tossing the Palm/iPaq family of gadgets for paper and pencil, and going with a more elegant, simple phone that works better as a phone, such as the RAZR or some of the Sony/Nokia offerings. Last time I checked, the mass market still has watches, and those multi-function, wildly customizable digital monstrosities of the 80s died a deserved death. Convergence only works when the result isn’t a compromise. When the BlackBerry got the phone part right in addition to mobile email, without trying to edit your Excel spreadsheet on a 2″ screen, it succeeded. The first few BlackBerries that had just the office functionality with mobile email were just mobile email devices, because to the majority of users the office app functionality was too big of a compromise.
The point of market volume as an OEM is a side argument without merit or relevance. To Mr. Elgan’s point, I know lots of people who refer to a BlackBerry, but nobody who refers to a “Windows Smart Phone”. The Windows software is customized for each phone. The fact that Java is on more cell phones than the Windows system is just as irrelevant. There is no buzz around Java on a cell phone, and there is no buzz around Windows Smart Phones. There’s a heck of a lot around BlackBerry though.
What I can’t understand is why this all seems OK to people, given how very poorly the Zune is hitting the market. It’s obviously a panic release, and it shows that the consumer experience, and more importantly the Microsoft brand, doesn’t seem to matter very much. Microsoft, who attached themselves to a brand of “Plays for Sure” and then walked away from it, have lost brand value in the parent name as well as killing the goodwill the Plays for Sure initiative had. Branding does matter to the consumer, and to market success. Zune, with this sort of offering, is detracting from the Microsoft brand. Apple delays when something isn’t ready for the market or when it’s not polished enough. Microsoft releases it and tries to fix it in a later release, causing a pile of grief for the customers in the meantime.
I’d like to leave him with one last question. What about all those other devices that were more open, more extensible, more functional, and cheaper? Things like the Creative Zen, also widely hacked and customized, with more features, more product lines and configurations, and cheaper to boot? Microsoft ditched them for its own, non-compatible solution. I’m sorry, but based on the way the Zune is entering the marketplace and what it looks and feels like as an initial customer experience, the head of the Zune initiative should be looking for a new job. The number of mistakes that were made, especially ones that irritate the customer, makes this an abysmal failure. The Xbox 360 was light-years beyond this in out-of-the-box experience.
Canada had it – lost it – had it again – lost it again….
Tim O’Reilly put up an interesting piece on cell phone carriers, and what he and others on the O’Reilly staff really want from a carrier. It’s over at Ten Things I Want From My Phone.
The big thing is good services with fair pricing. Notable rants that apply to us in Canada as well include the locked, debilitated cell phones from carriers (not to mention the North America-only CDMA nonsense; the majority of the planet is on GSM, so plan the transition). The irony is that twice we’ve had this wonderful situation of having a “JetBlue” of cell service, as they put it, only to have it bought up by incumbents and crippled to various degrees.
ClearNet was an absolutely outstanding cell provider in Canada. Cheap entry, good phones, simple plans, and quite a la carte in model. It was CDMA, but that was before the GSM standard really took off. I was a long time happy ClearNet customer. Then, they were bought by Telus. Within 4 months, there was a noticeable change in service and pricing plans, and an emergence of nickel-and-diming mentality. Thus endeth the ClearNet enlightenment. The only thing that survived was the branding and ad agency that was adopted, quite wisely, by Telus. Now even those ads are losing their elegance and getting… “busy”.
So after looking around, and deciding that GSM was the way to go so I don’t need to rent a cell phone when I do wind up on the odd business trip to Europe, I came upon Fido. Again, a great upstart: GSM, clear plans, excellent metropolitan service (the downside being only metropolitan service, but that was just fine for me, and they were new to the game) and, again, a great relationship with the customer. That was an enjoyable few years as well, and then Rogers came in and bought them. Now, to Rogers’ credit, they haven’t killed them off; they are more a subsidiary. But I have still noticed a number of new “nickel-and-dime” bits seeping into the bill starting shortly after the acquisition, and the service was not as good as before, so I needed to add $5 a month to also enable the use of Rogers’ “extended area” cell footprint. Since that got me access outside of the metro areas, I was OK with that, but it was still a bit annoying. Essentially the company got an advantage from the purchase of Fido, whereas the customers got nothing, or in some areas slightly less than before.
Now, I’m not against company purchases and it makes good business sense, but it seems there simply are not enough competitors in this marketplace to settle it out. In Europe, you get charged only for calls you make. Incoming calls are only charged if they are roaming calls and you’re on a North American carrier. But the model is caller pays. That makes a great deal of sense, and encourages cell phone use. In Canada, and the US, we get double-tapped by the carriers. Charged on both sides for both parts of the conversation. To say nothing of downright ridiculous data plans.
So as a thought, what if, in this municipal wifi flurry of excitement, we take a bit of public infrastructure and enable the marketplace? If the wifi towers go up in a city, and provided the antenna physics and such work out, allow junior carriers to rent space on the towers at a reasonable cost. The wifi towers should have fiber access to them at any rate, to carry the data onto the Internet and back to the carrier or exchange point, so adding a cell station (GSM, please) as well isn’t a huge capital outlay. The switches aren’t cheap, but it beats plowing cable and negotiating tower placement. This would enable small cell companies to get rolling, even on a per-city basis, and get some innovation and competition in. Plus they aren’t huge takeover targets, as they don’t own as much of the infrastructure, and perhaps access to those towers is limited to low-market-cap companies, or the towers otherwise don’t allow the density per cell some of the big players need. They are enablers, but gradually companies would outgrow them.
Much like the cable and telephone lines that are plowed into our neighbourhoods and houses, the airwaves are in many ways a public infrastructure. They are limited in supply, and are on public land with granted rights-of-way for the service they provide. Lock-in on a public infrastructure is just anti-competitive. The cost of that plant has been paid for dozens of times over by the customers; otherwise Telus and others wouldn’t be heading toward becoming income trusts. If you’re not making a healthy profit, an income trust isn’t a good idea.
I want the second coming of ClearNet, but maybe done by Orange or T-Mobile from the UK or someone in the leading countries of cellular service, not the backwaters of North America. What would you like to see in Canada for cell service?
Currently playing in iTunes: Razzle Dazzle by Richard Gere & Cast
THIS is what an “iPod killer” looks like?
It’s painfully obvious the bulk of the media has no clue how to deal with quality over features and performance. Which is why every single feature-ridden, cheaper-than-dirt, bling-bling outfitted competitor to the iPod has been labelled an iPod killer…. by everyone except the people buying them.
Witness the “try anything” approach of Microsoft in that they released the “Plays for Sure” moniker two years ago, which of course did basically nothing at all for the music players outside of the Apple brand, and ensured that they would seek to cannibalize each other as there was no flow-through revenue ecosystem like Apple has with iTMS, iTunes and the iPod. But hey, “Plays for Sure” should still stand for something if Microsoft really cares about the consumers.
Judging from this, I would say that Microsoft is worried about DRM and lock-in, and very little else. The work-around Microsoft is offering to get songs from its own store onto its own player is the same avenue you have to take to get tracks from the iTunes Music Store onto the Creative Zen, Microsoft Zune or any other player, which is to say: burn them to an audio disc, and then re-rip them to pure MP3 format. Of course, any pure MP3 file will automatically import into iTunes and play on your iPod, as will any AAC (MPEG-4) audio file and many other wondrous standard formats. The lock-in is there, and it’s around the digitally purchased music.
And this is the point really. An “iPod Killer” isn’t just the gadget. As Apple is only too aware and far too ingrained in, it’s the experience. It just has to work. This kind of disconnect between the vendor’s first store, and the new gadget and ecosystem that’s locked in, is a serious rupture in the brand identity of Microsoft in the music arena. This hurts the reputation grievously because, in the eyes of the average consumer, Microsoft music doesn’t work with my new Microsoft player. End of experience. The technical details and work-around are all there, and you can get it sorted with a chunk of work, but the point is the experience that is remembered is “I had to fiddle with it.” or “It didn’t even work with the music I bought six months ago from Microsoft’s music store!”
Personally, I feel DRM is a great hindrance, but then I also know that there’s a chunk of people out there who feel they should be able to get it for free if they can figure out how. Dishonest people cause DRM, and the rest of us honest folk are the ones it inconveniences. That said, I’ve never had a single problem with Apple’s FairPlay system, and I’ve transited across two Macs and two iPods over my history with it, without a hitch, enabling all the Macs in our house to play the purchased music shared locally and legally, all within the enabling model of iTunes.
If the customers mattered, they would be able to download the tracks again in the Zune format, or would be given credit for that number of tracks with the purchase of a Zune, or at least some sort of nod to say “Thanks” for the patronage and faith in the failed first Microsoft Music Initiative. But then, as I note, this isn’t an iPod killer, because it’s not about the user.
I’ve tried (not bought) a number of the early and mid-iteration MP3 players, and none of them just work and let me do what I want, which is find, sort and listen to my music. Easily and without a pile of other features I don’t use. My whole 250+ CD collection of purchased music (both real disc and iTunes) all in my pocket. That’s what the iPod is about, and until this sort of nonsense stops and the focus is on the consumer that is shelling out for the music and the player, there won’t be a challenger, let alone a killer in the midst.
Disclaimer: I’m a Mac and iPod owner (obviously), work on Windows, still have Linux on servers and occasionally deal with Solaris still.
Currently playing in iTunes: Surface to Air by The Chemical Brothers
NSLU2 and Openslug on Mac OS X
Ok, I’m NOT an embedded hacker (unless I have to be), but I do like fiddling with the Linksys hardware, both the wireless routers and the relatively new NSLU2 attached storage device. But the firmware is old and kind of crotchety, and of course those passionate folks of all things GPL and other goodness have been hacking away at the firmware of this toy. Mac OS X is not the platform of choice to partake of this goodness, but with Fink you can still get there at some levels.
The trick is: get the latest Fink, and grab the latest OpenSlug or other Unslung firmware for the NSLU2 if you’re so inclined. If you don’t know what I’m talking about, this is NOT for the faint of heart, so read here and get acquainted or scared off as appropriate. With Fink, get the *unstable* package for libpcap (0.9.4 as of this writing) and then you’re set for some fun.
Now it’s time to do that wondrous compile thing. Problem is, the configuration is all set up for DarwinPorts, which I’ve never felt is quite at the level Fink is. So rather than pull all of DarwinPorts in, it’s actually a short trip to compile the upslug2 firmware updater software on your Fink-enabled Mac OS X system.
Grab the tar.gz for all platforms, unzip and untar it, and then get into the directory. Using a modification of the “readme.macosx”, your command line becomes:
CPPFLAGS=-I/sw/include LDFLAGS=-L/sw/lib ./configure --with-libpcap
That, followed by the usual make, will get you the upslug2 executable, ready to rock and overwrite the firmware of your NSLU2 with all sorts of open source hackery. I find the fun more in the packages and customization than in the compiling and tweaking of the firmware itself. If you find yourself in a similar camp, these tips may be of some help to you.
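For anyone who wants the whole dance in one place, here’s a rough sketch of the sequence. The tarball name and version are my assumptions, so substitute whatever you actually downloaded, and make sure you’ve enabled Fink’s unstable tree first.

# install libpcap from Fink's unstable tree
fink install libpcap

# unpack the upslug2 source and build it against the Fink libraries in /sw
tar xzf upslug2-x.y.z.tar.gz   # substitute the real file name and version
cd upslug2-x.y.z
CPPFLAGS=-I/sw/include LDFLAGS=-L/sw/lib ./configure --with-libpcap
make

# the upslug2 binary now sitting in this directory is what you point at an NSLU2 in upgrade mode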
Enjoy, and props to the Unslung and other NSLU hackers out there for getting these tools rocking and some seriously capable firmware going!
Currently playing in iTunes: The Power Of Love by Frankie Goes To Hollywood
Add for technorati
This is a claim post, so I’ll rub it out when Technorati gets its act together. Currently it only finds the other blog at http://codekensai.com/wordpress
Sigh.
Wait on the multicast – got me a Mighty Mouse
Well, I picked up a new Mighty Mouse that was on hold for me from my good
friends at Westworld
Computers and have been trying it out for a bit.
Truth be told, the reviews at places like Engadget and others tell you pretty much exactly what you’ll get and what sort of experience you’re in for. So it’s all down to opinion on what you think of it.
For a 1.0 piece of hardware, it’s Apple through and through. It’s got
those touches of elegance and a good helping of innovation, but maybe
some of it has a bit of rough edge to it. If you’ve ever acquired a
first release of one of their hardware platforms right out of the gate,
you know what I mean.
I think it’s a fine mouse. Maybe a touch pricey, but the feel of that scroll ball and the integration with Mac OS X are very well done. The tiny little sounds created for the ball and the side button pair are signature Apple: subtle but sufficient. It does have the issue that you pretty much have to lift your left-click finger(s) off the surface to get a right-click, and to my taste the side buttons require a bit too much pressure to activate, but overall it’s very nice, and for the average Mac user not running USB Overdrive, this is a worthy upgrade from the one-button mouse or from the average Logitech optical scroll mouse. When it does go wireless, I might need another one.
If you get a chance, try one out. I think it’s a matter of personal taste, but with the new control panel allowing things like the middle button/ball to activate Dashboard and the side button pair to activate the application switcher, it’s a mouse-only experience for surfing in a lot of cases now. And coming from this keyboard-shortcut addict, that’s actually kind of cool. If you want to really kick it up, go with the USB Overdrive add-on that now supports the Mighty Mouse, or really go nuts and get a googol-buttoned USB input device with the Overdrive software.
XML programming languages????
Ok, WTF? Guy Steele Jr. is a pretty bright guy. I’m just getting into LISP beyond the lightweight knowledge I had for tweaking emacs, and I have a glimpse of what he’s talking about with extending the language in LISP. Heck, it goes WAY beyond that. The article I refer to by Mr. Steele is here.
There’s a really good quote about LISP, of which I dug up an incarnation
by John Foderaro here.
“…Lisp is a programmable programming language. Not only can you
program in Lisp (that makes it a programming language) but you can
program the language itself.”
Steele figures that the language will be represented as XML due to the popularity of and tools available for it. XML will represent the structure
and higher abstractions in the language, and the syntax and onscreen
representation will become merely a view on the structure represented in
the XML. I think I get where he’s coming from, and there’s some merit
there, but I think that’s like saying C++ macros are as powerful as
defmacro in LISP. XML is hierarchical. It’s not freely associable, and can’t begin to represent effectively or efficiently something on the
order of what a language such as LISP does. There’s a disconnect in
level of abstraction and power there that I can’t effectively explain at
this juncture, but it’s there.
Programmers are starting to understand higher and more powerful levels
of abstraction, design, and systems. I think more people would grasp
LISP at a deeper level now than when it was introduced. I also think
that by and large, most developers still fundamentally won’t get it. The
majority of developers, even GOOD developers, just don’t think that way,
which is closer to why LISP, especially LISP in the fullest sense, just
didn’t go mass-market the way C and C++ (and Java) did.
I think Steele has a kernel of a good idea, in that it seeks to get the
structure, intent and design of the program into a form that can be
manipulated as the proverbial model in the MVC, and allow us to make
that view into a variety of representations beyond our conventional view
of syntax. On that level, I completely agree that we need such a system,
but there are a lot of groups trying to do that, and it’s complex. Reducing it to XML is like saying the BNF of the system is all that’s
represented. That’s not a higher level. That’s machine code. We need to
be thinking at a higher level than that. We don’t need a representation
of our languages to be some slick transformable version of pretty. We
need those ideas and modules to be represented as groups and modules,
and ways of working with those groups and modules the way we work with
strings and ints. Steele has the right grit in the idea, especially with
the intro about the UNIX command line. But the power there is that those
strings and such are usually themselves the higher chunks of ideas, and
using those tools on basic sentences and dictionaries or using them to
format, sort, edit and update cross-references and bibliographies is
equally simple.
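To make that concrete with a toy example of my own (the file name here is hypothetical): the classic word-frequency pipeline treats an entire essay as nothing more than lines of text flowing between filters, and none of the individual tools knows or cares what kind of document it’s chewing on.

# split the text into words, one per line, and lowercase them,
# then count, rank and show the ten most frequent words;
# each filter only ever sees lines of text
tr -cs '[:alpha:]' '\n' < essay.txt | tr '[:upper:]' '[:lower:]' \
  | sort | uniq -c | sort -rn | head -10

The meaning of the chunks lives entirely in how the filters are associated, not in the filters themselves.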
The difference in my mind is that the higher abstraction is that chunk that the computer isn’t dealing with in the case of the command line. The chunks the filters and programs process are of arbitrary complexity. We need a system for defining and manipulating the association of the filters, the filters themselves, and the
behaviour of the shell itself. That’s a big step above where we’re at,
and it’s what I would call System Syntax, as opposed to Language Syntax.
It’s there. You can see the outline by the topics that seem to be
gaining steam in our craft. Domain Languages. Language Oriented
Programming. Model Driven Design. Unlike 4GL and CASE tools, we are
approaching it as building layers of abstraction as opposed to the
ultimate syntax processor and generator. We’ve got the syntax processors
and generators in languages, compilers, virtual machines and runtime
environments. Components were clumping the syntax bits together. We
still don’t have a way of clumping ideas and processing chunks together
the way you can abstract an entire category of algorithms with a LISP
macro. It’s a totally different level.
Maybe Steele’s approach is a first step, or at least another path to
wander down to better understand the problem and find a solution. I just
think it’s somewhat the wrong direction, but hey, he’s got more cred
than I do. But I still trust my instincts.
Got a comment on this? Sorry, but the bloged system I’m fiddling with
doesn’t deal with that. Email me instead for now at (dallas (A) hockley
(DOT) ca). If I actually GET some comments, I may look into a comment
system as that would be an indicator this went beyond a diary.
Keep on thinking!