Sunday, April 27, 2008
Alright, I have to write about this, if only for my own personal time-stamp. It's probably in no way worth reading.
A few years ago I started thinking, "Why don't Macromedia and Adobe merge?" It seemed pretty obvious to me. Then a few years later it happened. At the time, I was also thinking they could buy a Linux company and produce a bootable Live CD with trial or serial-activated versions of their software suite. Well, that last part never happened.
Then, when the SCO case came out, I was thinking, "Hey, this is total crap. It's not going to work out for them. Too bad I can't short their stock." It was all new then, but that's exactly how it turned out. Nothing exciting there; most Slashdot nerds were thinking the same thing.
I was also wishing I could buy Google stock way back when it was growing but everyone thought it was going to stop and have some kind of correction. That one would have worked out too. But alas, I'm not an investor.
Anyway, I've been reading and watching and listening to a lot of personal computer history lately, and in particular I've been reading about Commodore in On The Edge. As I'm reading this, I'm thinking "Hey, how come Apple doesn't make even more stuff in house?"
Think about it. Back in the day, making a computer was really making a computer. It meant lots of ROM code and chips and video circuitry and disk controllers, and figuring out where things go and how they should work. Today it's basically a CPU, almost always an x86, plus a chipset, and you're ready to roll.
I was thinking: just as Commodore bought MOS Technology in search of vertical integration, why doesn't Apple buy some chip company?
I just about lost my mind just now when I happened to catch that Apple has done exactly that.
So, from now on, I'm going to post all my crazy little ideas of what I think could, or might, or even should happen, just so I can say, if only to myself, "See, I saw that one coming."
Also, just for the record, and with little hope, proof, or even supporting ideas to back it up: I think Apple could get into the car market, as in building embedded systems for cars. Don't ask me why. I have no clue. I just see Apple in cars, the way Microsoft thought it would but never did.
Wednesday, April 23, 2008
Living Frugal Isn't Very Rock and Roll
Being rich isn't cool. Spending money like you don't care is cool.
What's cool about being cheap? There's no payoff for bystanders. When someone is totally rich but drives a beat-up old car, shops at thrift stores, buys and sells things used on auction sites and classifieds, and generally doesn't go around throwing lavish parties or having opulent dinners, then anyone standing on the sidelines isn't getting any payoff. If they aren't wealthy themselves, it's always at least a little fun to watch and live vicariously through someone else's over-the-top escapades.
The interesting effect of this is that there's a social reward feedback loop that takes place when someone within a social group starts earning what they perceive to be a lot of money. You get to be a little famous for being Mr. Moneybags.
The payoff is even higher if you do silly and outrageously unnecessary things. This is the essence of 'bling'.
So if someone is in a peer group of people who don't come from a history of money, and that person then makes a lot of money, they are highly likely to take a few splurges here and there, which get the attention of others, create some social validation, and reinforce the entire cycle until the wealthy person is wealthy no longer.
You can see this in the statistic that the average lottery winner goes back to having a job and a life roughly similar to their old one, all within about two years. [ref 1 needed]
Anyone who's ever gotten a big raise at work knows that, after taxes and deductions, it never ends up feeling like a whole lot on the next paycheck. But it's not that number that ends up driving the purchases. It's the perception of wealth that the current salary and position dictate. It's probably different for everyone.
But the net effect is the same. Saving money just isn't cool. It definitely isn't Rock Star livin'. Just ask Willie Nelson.
http://moneycentral.msn.com/content/SavingandDebt/P75072.asp
Saturday, April 19, 2008
LCD vs. CRT vs. Trees vs. Michael Bay?
Anyone browsing craigslist recently would have probably noticed something in the technology section. There are tons and tons of free CRT monitors unable to find a home.
I don't know what the environmental harm of this mass exodus of the mighty CRT from our businesses and homes will be, but it'll probably be bad. In fact, one could figure out the 'environmental return on investment', or EROI, by calculating some index of the environmental harm and comparing it to the environmental benefit of the power savings of LCDs.
It's funny to think that something that solves one problem (power-hungry CRTs burning up energy) creates another (CRT landfill), and that at some point enough time will have passed that the damage from the dumped CRTs will be done and behind us, while the power savings will have kept piling up. Someone could work out that math; I for one do **not** volunteer.
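If someone did want to sketch that calculation, it might look something like the following. This is just a back-of-the-envelope structure in Python; the "harm index" numbers are made-up placeholders, not real data, and only the 52-watt figure comes from the measurements below.

```python
# Rough sketch of the EROI idea: weigh the one-time harm of landfilling a CRT
# against the ongoing benefit of an LCD's lower power draw.
# The harm numbers below are placeholder guesses, not measurements.

disposal_harm = 100.0    # one-time "harm index" for dumping one CRT (made-up units)
harm_per_kwh = 1.0       # harm index per kWh of electricity generated (made-up units)
watts_saved = 52.0       # CRT average minus LCD average, from the readings below
hours_per_day = 8        # assumed daily screen-on time

kwh_saved_per_year = watts_saved * hours_per_day * 365 / 1000.0
harm_avoided_per_year = kwh_saved_per_year * harm_per_kwh

break_even_years = disposal_harm / harm_avoided_per_year
print(f"Break-even after roughly {break_even_years:.1f} years (with these made-up numbers)")
```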
However, what we can do is measure the power differences! Is it science? Who cares, it's fun! So I whipped out my Kill A Watt and did some measurements.
It turns out that how you use **both** of them changes the power usage. Here's the raw data:
- CRT Monitor -
All White Screen: 90 watts
All Black Screen: 71 watts
Average: 80.5 watts
- LCD Monitor -
All White Screen: 28 watts
All Black Screen: 29 watts
Average: 28.5 watts
- Some interesting numbers -
The CRT uses 27% more power displaying an all white vs. all black screen
The LCD uses about 3% less power displaying an all white vs. all black screen
The LCD uses about 65% less power, on average, than the CRT
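For anyone who wants to double-check the arithmetic (I don't entirely trust my own), here's a quick Python sketch that just redoes those percentages from the raw readings above:

```python
# Recompute the comparison percentages from the raw Kill A Watt readings.
crt_white, crt_black = 90, 71   # watts
lcd_white, lcd_black = 28, 29   # watts

crt_avg = (crt_white + crt_black) / 2   # 80.5 W
lcd_avg = (lcd_white + lcd_black) / 2   # 28.5 W

print(f"CRT white vs. black: {100 * (crt_white - crt_black) / crt_black:.0f}% more power")  # ~27%
print(f"LCD white vs. black: {100 * (lcd_black - lcd_white) / lcd_black:.0f}% less power")  # ~3%
print(f"LCD vs. CRT average: {100 * (crt_avg - lcd_avg) / crt_avg:.0f}% less power")        # ~65%
```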
It turns out, to no one's surprise, that a CRT monitor displaying nothing but black uses less power than when the screen is all white. This is because the electron gun is doing mostly nothing when drawing black. When the screen is white, the gun is firing all the time to light up the whole screen, and so you get nearly 20 more watts of power drain.
This is pretty interesting, since we usually don't think of **how** we use something as affecting the environment, especially when it comes to technology. If you have a CRT monitor and a really bright, mostly white screensaver and desktop wallpaper, you're drawing about as much power, spending as much money, and doing as much damage as that monitor possibly can.
However, I didn't think the LCD would register a noticeable difference, never mind the opposite effect! But it makes sense. With an LCD, the backlight inside is running all the time, regardless of what we're doing. The screen itself, though, acts like a series of small electronic sunglasses: turning on completely to block out the light and create a black pixel, and turning off completely to let the light through and create a white pixel.
It may only be 1 watt of difference, but it's interesting just the same. That means if your screensaver, desktop wallpaper, and daily activities are black or mostly dark, then you're maximizing your monitor's power draw, its cost, and its environmental impact.
But clearly, LCDs use so much less power overall that you're still better off going LCD, for your power, your money, and your environmental karma.
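To put some rough numbers on "power and money": assuming, say, eight hours of screen-on time a day and electricity at around $0.10 per kWh (both just guesses for illustration), the two averages work out like this:

```python
# Back-of-the-envelope yearly energy and cost for the measured monitors.
# Assumptions (not measurements): 8 hours of use per day, $0.10 per kWh.
HOURS_PER_DAY = 8
PRICE_PER_KWH = 0.10

def yearly(watts):
    kwh = watts * HOURS_PER_DAY * 365 / 1000.0
    return kwh, kwh * PRICE_PER_KWH

for name, watts in [("CRT (80.5 W avg)", 80.5), ("LCD (28.5 W avg)", 28.5)]:
    kwh, dollars = yearly(watts)
    print(f"{name}: about {kwh:.0f} kWh and ${dollars:.2f} per year")
```

That's roughly a $15-a-year difference at those guesses. Not life-changing, but not nothing either.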
But here's what I really want to know:
What did Google's all white homepage cost the environment in carbon emissions back when CRTs were the common monitor?
What does Windows XP's default mostly black screensaver cost the environment in carbon emissions now that LCDs are the common monitor?
How many trees cry every time a Michael Bay movie explosion whites out a screen on a home theater system?
Friday, April 18, 2008
Old Computers
First of all, I love computers. New and old. Technology is great. Sign me up and plug me in!
I was talking to my brother recently about old computers and what it is I like about them.
There are a lot of things about old computers that are great. Just tonight I realized that you could actually have a fairly complete understanding of a 1980s personal computer from top to bottom. From the logic gates, to the CPU, whether it's a Z80, MOS 6502, or Intel 8086, through to the operating system, and right up into the program. Head to toe.
Today when you take a computer science course, you learn little pieces here and there: a logic gate, or the ideas of a low-level programming language, often in the form of an emulated CPU or a fictional CPU designed for learning. Maybe you even build a binary adder. But then it's back to learning something useful, like Java or C#.
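That adder exercise really is tiny, by the way. Here's roughly what it amounts to, sketched in Python with bitwise operators standing in for actual gates:

```python
# A one-bit full adder built from the basic gate operations: AND, OR, XOR.
def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in                        # sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))  # carry bit
    return s, carry_out

# Chain eight of them together and you have an 8-bit adder.
def add_8bit(x, y):
    result, carry = 0, 0
    for i in range(8):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result, carry

print(add_8bit(150, 99))   # (249, 0) -- 150 + 99 = 249, no carry out
```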
Maybe I'm wrong; it's not like I'm in university right now. But learning some of these basic things by making a logic gate or an adder feels like growing a single blade of grass as a starting point for learning football. It's that far removed.
It seems like making computers do things is mostly a case of taking an x86-based CPU, some fully realized OS, a bytecode-based virtual machine, a whole whack of libraries and APIs, and an editor or IDE with full knowledge of those APIs, and boom - you finally start making something.
This makes sense; it all evolved out of the need to encapsulate away layers of very annoying details so that it's easier to create and maintain your project, or perhaps just your piece of the project.
I think this is all great, really I do. Ruby on Rails is really quite awesome!
It all seems like so much. So many layers, and pieces, all of which are moving targets, and all of which need constant refreshing. It seems like we went from swimming in different, but well known ponds - Apple, Commodore, IBM, DOS, CP/M - to one great big ocean. It's just overwhelming to understand, and sometimes underwhelming to explore. Things got more complex and more homogeneous.
Using computers today seems so much like watching TV. The web pages turned into databases which turned into web apps which turned into social web apps which turned into 1-click perpetual payoff. Meanwhile, desktop apps are more like web pages.
But I think the experience of using computers today has lost something else.
You see, the GUI changes the very idea of what using a computer should be.
Let's take a side trip. Imagine you're living millions of years ago in a thriving village, and things are going so well that there's lots of food and no worry of war or scary animals. Now imagine your belly is full and you're looking out into the wilderness. What would you be thinking?
Now imagine you're living in a modern city today, sitting at a restaurant after a meal, looking at a dessert menu. What are you thinking now?
When the GUI first came onto the scene, it offered a very friendly way of using a computer. Instead of having to know a bunch of stuff, you are presented with a list. All the functions and programs of the computer are laid out in a nice little menu, complete with pictures.
What this is saying to the user is "We've figured everything out for you, and here's all the things you can do. Wouldn't you like to pick something to do? We've made a nice menu for you to choose from."
With the advent of the modern internet, that dessert menu is pretty damn big. So big, in fact, that you can spend an almost indefinite period of time getting entertained by it.
But way back in the early days of the personal computer, this is what it was like.
You got a prompt. A flashing green cursor.
You don't get to pick from a list of things to do. You can do anything you want. You can do anything.
There's no list. There's no waitress walking you pleasantly through a glossy catalog of choices.
You're faced with your own two hands and your ability to explore and create. That flashing green cursor is sitting there waiting for you to go, or do, or be anything you want it, or yourself, to be.
It's a canvas, not a catalog.
I realize that most people know what they want, and they like it when someone else has figured it all out for them. What I wonder about is the kids.
They get blasted into an ocean of high-speed internet whiz-bang, instead of sitting on the shore, wondering what's out there, and maybe making their own raft.