Recently in Random thoughts or ideas Category

TOML vs. JSON

[This is still only draft quality but I think it is worth publishing at this point.]

Internally at Stack Exchange, Inc. we've been debating the value of certain file formats: YAML, JSON, INI, and the new TOML, just to name a few.

[If you are unfamiliar with TOML, it is Tom's Obvious, Minimal Language. "Tom", in this case, is Tom Preston-Werner, founder and former CEO of GitHub. The file format has not yet reached version 1.0 and is still changing. However, I do like it a lot. Also, the name of the format IS MY FREAKIN' NAME which is totally awesome. --Sincerely, Tom L.]
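
For readers who have never seen it, here is a small TOML sample (values invented; the syntax is pre-1.0 and may still change):

# TOML comments start with "#".
title = "An example"

[owner]
name = "Tom"
dob = 1979-05-27T07:32:00Z    # dates are a first-class type

[database]
ports = [ 8001, 8002, ]       # a trailing comma is allowed here
enabled = true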

No one format is perfect for all situations. However, while debating the pros and cons of these formats, something dawned on me: one group of formats is for humans and another is for machines. The reason there will never be a "winner" in this debate is that you can't have a single format that is both human-friendly and machine-friendly.

Maybe this is obvious to everyone else but I just realized:

  1. The human-friendly group is easy to add comments to, tolerant of ambiguity, and often weakly typed (differentiating only between ints and strings).

  2. The machine-friendly group makes it difficult (or impossible) to add comments, is less forgiving about formatting, and is often strongly typed.

As an example of being unforgiving about formatting, JSON doesn't permit a comma after the last item of a list or object.

This is valid JSON:

{
   "a": "apple", 
   "alpha": "bits", 
   "j": "jax"
}

This is NOT valid JSON:

{
   "a": "apple", 
   "alpha": "bits", 
   "j": "jax",
}

Can you see the difference? Don't worry if you missed it because it just proves you are a human being. The difference is the "j" line has a comma at the end. This is forbidden in JSON. This catches me all the time because, well, I'm human.
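
Here is a minimal sketch of the same pitfall as seen from code, using Python's standard json module (the strings are just examples):

import json

valid = '{ "a": "apple", "alpha": "bits", "j": "jax" }'
invalid = '{ "a": "apple", "alpha": "bits", "j": "jax", }'

print(json.loads(valid))    # parses fine

try:
    json.loads(invalid)     # the trailing comma makes this raise an error
except json.JSONDecodeError as err:
    print("rejected:", err)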

It also distracts me because diffs are longer as a result. If I add a new value, such as "p": "pebbles", the diff shows more than just the new line:

$ diff x.json  xd.json 
4c4,5
<    "j": "jax"
---
>    "j": "jax",
>    "p": "pebbles"

However, if JSON did permit a trailing comma (which it doesn't), the diff would be shorter and more obvious.

$ diff y.json yd.json 
4a5
>    "p": "pebbles",

This is not just a personal preference. This has serious human-factors consequences in an operational environment. It is difficult to safely operate a large complex system, and one of the ways we protect ourselves is by diff'ing versions of configuration files. We don't want to be visually distracted by little things like having to mentally de-dup the "j" line.

The other difference is around comments. One camp permits them and another camp doesn't. In operations we often need to temporarily comment out a few lines, or include ad hoc messages. Operations people communicate by leaving breadcrumbs and todo items in files. Rather than commenting out some lines I could delete them and use version control to bring them back, but that is much more work. Also, I often write code in comments for the future. For example, as part of preparation for a recent upgrade, we added the future configuration lines to a file but commented them out. By including them, they could be proofread by coworkers.

It was suggested that if we used JSON we would simply add a key to the data structure called "ignore" and update the code to ignore any hashes with that key. That's a lot of code to change to support that. Another suggestion was that we add a key called "comment" with a value that is the comment. This is what a lot of JSON users end up doing. However, the comments we needed to add don't fit into that paradigm. For example, we wanted to add comments like, "Ask so-and-so to document the history of why this is set to false" and "Keep this list sorted alphabetically". Neither of those comments could be integrated into the JSON structures that existed.
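
For illustration, here is roughly what that "comment" key workaround looks like (a hypothetical snippet, not our actual configuration):

{
   "comment": "Keep this list sorted alphabetically",
   "a": "apple",
   "alpha": "bits",
   "j": "jax"
}

It works for a note that describes a whole object, but there is no natural place to hang a comment on a single value buried inside a list.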

On the other hand, strictly formatted formats like JSON are, in theory, faster to parse. Supporting ambiguity slows things down and leads to other problems. In the case of JSON, it is so widely supported that the ubiquity alone is a good reason to use it.

Some formats have typed data, others assume all data are strings, others distinguish between integer and string but go no further. YAML, if you implement the entire standard, has a complex way of representing specific types and even supports repeated data via references (anchors and aliases). All of that turns YAML's beautifully simple format into a nightmare unsuitable for human editing.
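
A minimal sketch of what I mean, assuming the PyYAML library (the key names and hostname are made up):

import yaml   # assumes PyYAML is installed

doc = """
defaults: &defaults
  timeout: 30
  retries: 3

production:
  <<: *defaults            # merge key: pulls in the anchored mapping
  host: prod.example.com   # placeholder hostname
"""

data = yaml.safe_load(doc)
print(data["production"]["timeout"])   # 30 -- an int, not the string "30"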

I'm not going to say "format XYZ is the best and should be used in all cases" however I'd like to summarize the attributes of each format:

*  Attribute                                 JSON     YAML                 TOML        INI
M  Formal standard                           YES      YES                  soon        no
M  Strongly typed                            YES      YES                  string/int  no
M  Easy to implement the entire standard     YES      no                   YES         YES
H  Awesome name!                             no       no                   YES         no
H  Permits comments                          no       start of line only   YES         usually
H  Diffs neatly                              no       YES (I think)        YES         YES
H  Can be programmatically updated
   without losing format or comments         yes-ish  NO                   soon        NO

The * column indicates if this quality is important for machines (M) or humans (H). NOTE: This chart is by no means complete.

Personally I'm trying to narrow the file formats in our system down to two: one used for machine-to-machine communication (that is still human readable), and the other that is human-generated (or at least human-updated) for machine consumption (like configuration files). (Technically there's a 3rd need: Binary format for machine-to-machine communication, such as ProtoBufs or CapnProto.)

I'm very optimistic about TOML and look forward to seeing it get to a 1.0 standard. Of course, the fact that I am "Tom L." sure makes me favor this format. I mean, how could I not like that, eh?

Update: 2015-07-01: Updated table (TOML is typed), and added row for "Awesome name".

How many times have you seen this happen?

Email goes out that mentions a date like "Wed, Oct 16". Since Oct 16 is a Thursday, not a Wednesday (this year), there is a flurry of email asking, "Did you mean Wed the 15th or Thu the 16th?" A correction goes out but the damage is done. Someone invariably "misses the update" and shows up a day early or late, or is otherwise inconvenienced. Either way cognitive processing is wasted for everyone involved.

The obvious solution is "people should proofread better" but it is a mistake that everyone makes. I see the mistake at least once a month, and sometimes I'm the guilty party.

If someone could solve this problem it would be a big win.

Google's Gmail will warn you if you use the word "attachment" and don't attach a file. Text editing boxes in all modern web browsers and operating systems have some kind of live spell-check that puts a red mark under a word that is misspelled. Some do real-time grammar checking too.

How hard would it be to add a check for "Wed, Oct 16" and similar errors? Yes, there are many date formats, and in some cases one would have to guess the year.
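
Here is a minimal sketch of the check in Python (the function name and the guessed year are mine, purely for illustration):

from datetime import datetime

def check_weekday(text, year):
    # Parse a string like "Wed, Oct 16" against a guessed year and warn
    # if the named weekday doesn't match the actual date.
    parsed = datetime.strptime("%s %d" % (text, year), "%a, %b %d %Y")
    claimed = text.split(",")[0]
    actual = parsed.strftime("%a")
    if claimed != actual:
        return "Warning: %s, %d is a %s, not a %s." % (text, year, actual, claimed)
    return "OK"

print(check_weekday("Wed, Oct 16", 2014))   # Oct 16, 2014 was a Thursday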

It would also be nice if we could write "FILL, Oct 16" and the editor would fill in the day of the week. Or a context-sensitive menu (i.e. the right-click menu) would offer to add the day of the week for you. If the time is included, it should offer to link to timeanddate.com.

Ok Gmail, Chrome, Apple and Microsoft: Who's going to be the first to implement this?

I was only 7 months old when Neil Armstrong became the very first man to walk on the moon. I don't remember it very well.

Today I was reminded that most of what we see of the moon landings are highlights: 10-second little clips. I would like to know what the entire 8 days were like. I'm sure there are audio and video recordings of the entire thing. All of NASA's recordings are public domain, so they must be available somewhere.

Here's my thought for a product. A kit that includes audio and video recordings and other stuff to help you re-live the entire 8-day experience. An audio recording that we would listen to in real time, along with TV inserts of broadcasts as they happened. Plus 1960s recipes and other stuff so a group of people could simulate the entire thing. They could go on "a vacation to 1969" and spend a week living like it was July 1969.

Yes, 8+ days is a very long time but imagine if:

  1. It was done near some other vacation place and they arrange it so that at key times you are near a TV to watch the news. Some days would be more "sit at the TV watching the action" and other days would be unrelated activities but everyone would watch the nightly news together at 6pm to see what Walter Cronkite was telling everyone.

or

  2. They make a simulator so that you are Neil Armstrong, or at least the Flight Director, going through the motions for all 8 days.

or

  3. YouTube could livestream all the audio/video for 9 days straight and everyone could just tune in. All over the world people would "play along", making it a shared experience everyone could enjoy. (It would be like The Yule Log, only a week+ long event that we do every July.)

I haven't put a lot of thought into this. There are many logistical challenges. Plus, it could be extremely expensive to do it right. That's why I think a kit that lets people do it themselves during the summer would make more sense.

Anyway... I want to put it out there in case anyone has comments or thoughts about how to make it happen.

Tom

You may have read the Popular Science article:

Thieves Stole $45 Million From ATMs Because The U.S. Uses Absurd 40-Year-Old Technology

Let me quote:

So why is the US so far behind? Infrastructure is a major factor; countries like Japan and the UK are much smaller, so replacing all the old point-of-sale machines and ATMs is easier.

Bullshit.

Bullshit. Bullshit. Bullshit.

The reason is that bank executives had the choice between paying a lot of money to do the right thing or a little money to consultants who would tell them what they wanted to hear. It's a big win for consultants.

WHY IS POPULAR SCIENCE BEING SO ANTI-CONSULTANT???

Everyone got what they asked for. What's so bad about that?

And besides, I'm sure the banks are insured for this kind of thing.

The real headline should be, "Insurance companies lose $45 million from signing contracts with banks that couldn't care less because they've signed contracts with insurance companies that remove the need for them to give a shit."

Amirite? (No, really, can someone from the banking industry confirm?)

...that I got caught in a "spear phishing attack". (A malware attack where they send an email specifically crafted to one or two people.) The email was a receipt from a hotel that I stay at occasionally but it listed the address as being in South Carolina instead of San Francisco. I clicked on the PDF to read it and then realized I was being phished because I haven't been to South Carolina in ages and the invoice mentioned a coworker that I've never traveled with. I started shutting down my computer and made plans to wipe the disks; glad I have good backups but not wanting to go through the pain of being without my laptop until I could do this.

That's when I woke up.

Yes, it was a dream.

I have a friend who only clicks on web links if they are on a ChromeOS machine. They use many machines, but if they get a link that is outside their domain they move it to a ChromeOS box to click on it. That's an interesting discipline to develop. I wonder how soon more people will do that.

It used to be that there was a small group of people who were extremely paranoid about giving out their social security number or credit card numbers. At the time people called them "paranoid". Now there is this thing called "identity theft" and those people are considered "forward thinkers".

I wonder what paranoid behavior today will be normal in the future.

IT systems have many parts. Each needs to be upgraded or patched. The old way to handle this is to align all the individual release schedules so that you can make a "big release" that gets tested as a unit and released as a unit. You can do this when things change at a sane rate.

Now more things are changing and the rate is much faster. We also have less control. Operating systems have frequent patches. There are urgent security patches that need to roll out "immediately". Applications have frequent updates, many even upgrade themselves. Our PCs have firmware updates for the BIOS, the keyboard, the IPMI controller, the mouse (yes, my damn mouse needed a flash update recently!). There is no way we can align all these release schedules, test as a unit, and release it as a whole.

The situation is the same or worse for web services. The whole point of a Service Oriented Architecture (SOA) is that each piece is loosely coupled and can be upgraded at its own schedule. If every service you depend on is upgrading out from under you, it isn't possible to align schedules.

The old best practice of aligned release schedules is becoming less and less relevant.

I'm not saying that this is good or bad. I'm saying this is the new reality that we live under. In the long term it is probably for the best.

My question for the readers of this blog is: What are the new tools and best practices you use that address this new paradigm?

Metcalfe's law

Metcalfe's law states that the value of a telecommunications network is proportional to the square of the number of connected users of the system (n^2).

Robert M. Metcalfe, the inventor of Ethernet, originally meant it to apply to devices on a network that could communicate with each other. It isn't sufficient to be on the same network if they speak incompatible protocols. It isn't sufficient to speak compatible protocols if they aren't connected.

A more plainspoken way to state Metcalfe's law is that every one new user added to a network makes the network more than one unit more useful.

A simpler way to understand this law is: "The first person to buy a fax machine was a fool." Imagine how useless it would be to be the only person in the world with a fax machine. You can't send anyone a fax because nobody else owns a fax machine. When two people owned fax machines the utility, or usefulness, increased considerably, assuming those two people needed to communicate. When 100 people owned fax machines the network was far more than 50 times as useful, because each of those 100 people could now reach 99 others: 4,950 distinct pairs. Maybe not all 4,950 pairs would be used, but the network had a lot more potential than when only a few people had fax machines.
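
A rough back-of-the-envelope illustration of that growth (plain arithmetic, nothing specific to fax machines):

# Distinct communicating pairs grow as n*(n-1)/2 -- superlinear in n.
def pairs(n):
    return n * (n - 1) // 2

for n in (1, 2, 10, 100, 1000):
    print(n, pairs(n))
# 1 0         -- the lone fax machine: useless
# 2 1
# 10 45
# 100 4950
# 1000 499500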

When Metcalfe invented Ethernet very few computers were connected to each other. Communicating between computers usually meant writing data to media such as tape or a "disk pack" and physically moving the media to the other computer. This was often done even if the computers were right next to each other. Think about all the old movies you've seen where computers have tape drives spinning big loops of magnetic tapes. That's how data got between computers.

Whether Metcalfe's law is exaggerated or misapplied to other things, the general point is correct: Linear growth in the number of users creates superlinear growth in the network's usefulness.

I think every sysadmin should understand this law. I think we "get it" in the literal sense: we get that more connected computers are more useful. We gain huge satisfaction when we add a device to our network, especially if it is one that previously couldn't connect, such as when WiFi is added to a home thermostat, television, or phone. We get that more compatibility within our network is more useful. We are frustrated when two software systems cannot talk with each other; we get huge satisfaction when vendors provide standard interfaces, APIs, and file formats so that more things are compatible.

On the social level Metcalfe's law applies as well. If you belong to a local or national user group gaining more members isn't just a matter of pride. Every additional member adds to the potential utility of the group. Every active member adds utility superlinearly. If you are a member of such a group, getting your friends and co-workers to join (or getting current members to be active participants) benefits you and all other members more than you'd think.

American Scientist has an article that (finally!) explains homomorphic encryption in simple enough terms that even I understand.

Homomorphic encryption permits me to send you encrypted data that you can manipulate but never read. You send it back to me, I decrypt it, and see the result. Imagine if a web-based word processor could store your document and edit your document, but never know what your document says. Yes, it sounds crazy, but it is theoretically possible. In the last 4 years that theory has been getting closer and closer to reality.
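
As a tiny taste of the idea (not the scheme described in the article): unpadded "textbook" RSA happens to be multiplicatively homomorphic, so whoever holds two ciphertexts can multiply the hidden values without ever seeing them. A toy sketch in Python:

# Toy demo key (p=61, q=53); never use numbers this small in practice.
n, e, d = 3233, 17, 2753

def enc(m): return pow(m, e, n)
def dec(c): return pow(c, d, n)

c1, c2 = enc(7), enc(9)
product_ciphertext = (c1 * c2) % n   # computed without knowing 7 or 9
print(dec(product_ciphertext))       # 63 == 7 * 9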

I think sysadmins should read this article to get an idea of what crypto might be like in the future.

Alice and Bob in Cipherspace: A new form of encryption allows you to compute with data you cannot read


AT&T Survey

I got a survey from AT&T Wireless that asked a lot of questions comparing my experiences between WiFi and 3G on my AT&T mobile phone.

If I were to reverse-engineer what they were getting at, either (a) they want to figure out why I dislike WiFi so they can fix those problems and encourage people to move traffic off their over-stressed 3G network, or (b) they need data to back up their coming campaign to bad-mouth WiFi and tell everyone to pay for their over-priced 3G.

Based on the tone of the questions, I really think it is "b".

I'm so glad I no longer work in the telecom world.

Dear universe,

There are 10+ different organizations that have to give me some kind of tax form so that I can file my taxes. I'm really happy that they are now all electronic. It is much easier to download them off the organization's website than to get them in the mail.

However, if these organizations are going to generate a PDF, can't they also generate a .irs file? A .irs file is an imaginary XML format that I wish existed. It would include all the data from the PDF but in a parsable format. I could take all the .irs files, put them on a USB stick, and hand it to my tax preparer or feed them into TurboTax.

I know it wouldn't completely automate tax preparation, but imagine how much easier tax prep would be if this existed. I wouldn't have to suffer through watching my tax preparer slowly retyping things (why am I paying him to do that?). Tax prep software would take each file and, possibly with a little human help, know where to apply the numbers.
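
Purely to make the wish concrete, here is what one imaginary entry might look like, parsed with Python's standard library (every element and attribute name is invented for this post; no such format exists):

import xml.etree.ElementTree as ET

# A hypothetical .irs file -- this format does not exist.
doc = """
<irs-form type="1099-INT" year="2012">
  <payer>Example Savings Bank</payer>
  <recipient-tin>XXX-XX-XXXX</recipient-tin>
  <box number="1" name="Interest income">123.45</box>
</irs-form>
"""

form = ET.fromstring(doc)
print(form.get("type"), form.find("box").text)   # 1099-INT 123.45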

I can't be the first to have thought of this.

Why doesn't it exist?

This is a good time of the year to send email to your employees reminding them that if they are going to be away for the holidays they should take some steps to save energy and money.

(feel free to cut and paste from this... I stole it from somewhere else too)

Some simple steps you can take:

  • Power off your monitors. Many people leave their monitors on or in standby, which uses more power than actually turning them off.

  • Shut off your computer or put it in sleep mode if you won't need it at all over the holidays. Note: some machines don't have an effective sleep mode, so if possible, turn them off.

  • In general, put your machine in sleep mode when you are away from your desk, especially at night or on the weekends. Even if you're not on vacation this week, use sleep mode to help save energy and reduce emissions.

  • Set your screen saver to "blank" if you do need to leave your machine on so you can access it remotely. Screen savers can cause some processors to use more energy than just a blank setting, and they aren't even necessary for monitors anymore - other than providing nice eye candy for someone watching. (Generating those fancy graphics requires extra CPU work, and that consumes more energy.)

  • Unplug other electronics. While you're at it, please check your area for other non-essential loads that could be unplugged or turned off over the holiday (lamps, holiday lights, electronic picture frames, etc.). This is a good idea to do all the time, not just before the holidays. Even though we employ motion detectors and timers to shut down lights, these aren't everywhere, and it saves even more if you manually turn lights and projectors off when leaving the room.

  • Do NOT shut off a computer that isn't yours. Ask the owner FIRST or offer to power it off for them. Some machines may be controlling critical processes that run when people aren't around or have a special shutdown procedure (not just "hit the button"). Of course, any computer in a "computer room" or "computer closet" should not be powered off except by the IT staff. You wouldn't want us to not be able to process invoices (or payroll!) because someone powered off the wrong machine.

Finally, if you're traveling away from home you might consider unplugging things at home, too - helping the environment a bit while saving your money. http://standby.lbl.gov/summary-chart.html shows typical savings for appliances. For more tips the EPA http://www.epa.gov/epahome/hi-winter.htm#effective and Sierra Club http://www.sierraclub.org/tips/holidays.aspx have suggestions.

Have a great and safe end-of-the-year whatever-holiday-you-celebrate-even-if-it-is-none time of the year!

Quicken 2010

Gentlemen, set phasers on "grumpy rant".

I've been using Quicken since around 1995ish.

For the first time since my finances got complicated (10 years?) the "one step update" now smoothly updates all my accounts, downloading entries from all my various financial institutions. (Well, almost.  My low-tech mortgage company doesn't offer the ability to download updates. Why should they? They're too big to fail.)

All it took was spending an entire afternoon calling each and every institution's support number to find out what was wrong and how to fix it.

Truly the mark of high quality software.

I tried Mint.com and it does all the updates for me. Sadly, it doesn't do the things I currently do with Quicken.  Now that Intuit (the maker of Quicken) has bought Mint.com, let's see how quickly they can fuck it up.  No offense, Intuit, but the history of the software industry is rife with stories of mergers that sink both the buyer and buyee.

[ This is still "first draft" quality but I'm posting it rather than keeping it bottled up. Feedback appreciated.]

There are those who believe that the history of system administration will follow a path similar to that of electrical engineering. Broadly categorized, there are 3 types of careers in that area:

  • Electricians: People that have limited scientific education, but through apprenticeships and certifications they do the majority of the work in buildings, both deployments and repairs. They "follow the building code" (the building and safety guidelines for their state or country) but couldn't write new building codes (and would never try). Inspectors are paid to check their work for conformance to the "building code". 80% of all electrical work is in this category, and it is usually thankless and boring.
  • Electrical engineers: People that have university degrees and understand both the theory and practice of what they do. They specialize in specific areas (construction, circuit design, chip design, etc.). They design new products. More advanced EEs write the building codes that electricians follow.
  • Researchers: People (typically with PhDs) that are advancing the science of electrical engineering. They may invent entirely new ways of doing things, rather than just new products.

The field of system administration is already following this kind of trajectory. There are people in that first category: they have Cisco, MS, and LPI (Linux) certifications, and they are mostly deploying vendor-approved architectures and design patterns (known as "best practices"). When they get creative you should be as scared as you would be if an electrician installing a new circuit in your house told you he "got creative". We don't have the auditing or inspection system yet, but SOX is the closest we have.

System administration has that second category too. They usually are the senior sysadmins in a company, and often are employed by vendors to create the best practice documents and certifications used by the first category. Sadly they often have the same titles as people in the first category which creates confusion.

The third category is quite rare in system administration. How often in our lives will something be invented that radically changes the way we do IT? There are a few that I can think of: local storage vs. remote storage (NFS). Individually managed accounts on each machine vs. NIS (later LDAP). Waiting for users to complain vs. monitoring for outages. Keeping machines in sync by hand vs. cfengine (later Puppet).

All of these were major changes to our industry (and I profess that 80% of the industry doesn't do most of those things yet, so there is plenty of work to do).

There are very few schools that have Masters or PhD programs in system administration. Some call it IT, and dilute it with a lot of research around what we used to call MIS. A lot of the innovation in system administration comes from industry, which is usually good, but sometimes taints the research.

I believe there are many interesting areas of research that need more effort:

  • Why are good practices so rarely adopted?
  • What prevents a constant number of sysadmins from administering growing populations of machines or users?
  • Why is debugging so complicated?
  • How to organize teams of system administrators to maximize macro efficiency and personal efficiency?
  • How to delegate to users without expecting users to be system administrators?
  • What traits do successful system administration organizations share?
  • Are we asking the right questions?

These are the same questions we've always asked yet the need for research grows as system administration becomes more complicated and society becomes more dependent on technology.

Maybe we need to write less code and spend more time thinking.

When I see really good instruction books, I smile.

Writing instructions on how to assemble a product is hard. Your audience is most likely coming at the situation "blind" with little experience, and there is little reason for them to invest time in becoming an expert in assembling the product because once it works, that knowledge becomes useless. Get it done and forget about it. Unless, of course, you plan on buying the same item again and again. With something like a bookcase, that isn't true.

Compare the typical furniture installation guide to what you get with furniture from Ikea. I don't think good instructions like this are an accident. I assume (can someone from Ikea confirm?) that they do usability studies: people with little experience are given the product and asked to install it as trained professionals watch them through a one-way mirror. The documentation is improved based on observations and interviews.

Now let's talk about the Apple Xserve installation process. I've installed, maybe, 6 Apple Xserve computers. The instructions are frustrating. You continually keep saying to yourself, "Why would they want me to do that?" and yet, if you skip those steps you end up regretting it later, often having to start over from scratch. "Oh, now I understand. If you don't use your left elbow to support the frizenfrats, your right hand won't be free to slide the wingding into the snortplex." The design of the Apple Xserve is extremely elegant, but you don't realize it until you are done; thus it is tempting to not follow the instructions precisely. If I went more than a month without installing an Xserve I would forget how to do it and feel like a newbie all over again. When you do it their way everything "just works."

If only they could explain at the beginning what it would look like at the end. Words won't do it justice. It's like explaining to a blind person what "red" looks like. I wish Apple would include a video showing how it looks and works when it's done. Demonstrate how it slides in and out of the rack so well. Then walk through the installation process, showing how the case is actually what gets attached to the rack, and the guts slide into the case. (If you've installed one of these things, you know what I mean.) After watching that, the user is no longer doing the installation "blind", actual installation time would be dramatically reduced, and the whole thing would be much less frustrating.

Sometimes the problem is simply fear of the unknown. People generally fear messing with the crazy, messy wires in a home entertainment system. Consider the Tivo with its multitude of connectors for video out: composite, HDMI, other acronyms I can't remember. They've put out a video that shows the installation process. I recommend people watch the video even if they don't own or plan on owning a Tivo. It's a good example of how to do a video introduction to a complicated process. The video is shot in a living room, not a lab. Notice that the voice is friendly and plain-spoken (referring to "that cable guy"), even making commentary ("I love it when it's color coded") and using slang ("this little puppy"). It starts with an overriding premise: all connections will be "from out, to in" and this phrase is emphasized throughout. Watch it and understand.

Does your company make a product that would benefit from an installation video?

Ten years ago: Caller ID? Hell no! I'm gonna get it blocked! This is a total invasion of privacy!

Today: I refuse to order pizza delivery from that place until they get a caller-id system so I don't have to repeat my address to them every time I call in an order.

The sad part is that we now go to the "not as good food, but they use my caller-id bits" place instead of the "great food, takes forever to place my order because they don't take advantage of caller-id" place.

I think the moral of this story is that people will gladly give up a reasonable amount of privacy if they get some value for it. Plus, in this case they are only getting the information that I would want them to have anyway. I am going to tell them my address (so they can deliver the food) and giving them my phone number is a reasonable thing in case they need to reach me to ask a question. Plus, all of this is in the phone book. If I wanted to keep it all secret I could pick up the food myself.

Banks are in a different situation. They seem to want to collect tons of information, not all of it obviously needed. When ordering a credit report from Equifax they ask for all your previous addresses, which they then use to supplement the information that they have about you. (All they really need is my SSN and full name, plus the address where I want the report delivered.)

It appears after years of criticism, Diebold may be ready to withdraw from electronic voting entirely. The company is concerned that this relatively small and marginally profitable unit is hurting the company's overall image.
Diebold Weighs Strategy for Voting Unit on Wired.com

If voting booth manufacturing is so lacking in profit, maybe all such vendors should get out of this business. In some countries the government has an independent division that runs the voting system. Just like the military is an independent branch of the government, the election commission develops voting technology, tests it, and runs the elections. It is an honorable group, held in high esteem, with very high standards.

OOPSLA request

It would be great if the annual OOPSLA conference added rock concerts and big parties every night. They could change their name to OOPSL-A-palooza.

Some days you are master of a huge data center. Other days you're just sittin' on the floor outside the "computer closet" trying to figure out why the new cheap servers won't boot.

That's just how it works.

LOTD

So Microsoft has announced that their new operating system shall be called "Windows Vista." And we are assured that the new name is not an acronym for Viruses, Infections, Spyware, Trojans and Adware.

http://www.livejournal.com/users/lilbjorn/12180.html

IVR that doesn't suck

People tend to hate those Interactive Voice Response systems ("press 1 for sales... press 2 for tech support... etc.") and so do I. However, I was recently impressed with something that Verio does on 1-800-GET-VERIO. A man speaks the odd-numbered choices and a woman speaks the even-numbered choices.

The alternating voices really made it easier to concentrate on what they were saying.

It reminded me of my days at high school radio station WJSV (90.5 FM) where I was taught to always pair two strikingly contrasting voices on any program we produced. The easiest way to do this was to have one male voice and one female voice. The result was an audio presentation that was easier on the ears.

This kind of attention to detail is appreciated.

Two ideas, the blog

I've been enjoying Two Ideas lately. You might want to check it out.

Musical Banner Pages

My cell phone has different rings for different people, and specific rings for "person not in address book" and "caller id is blocked."

What if I could upload MP3s to printers which would play different songs based on what was being printed?

  • Play an old-tyme ragtime song when plain ASCII is printed
  • Play a slow, lumbering, elephant-like song when someone prints something with lots of graphics
  • A series of songs could play as the printer starts running out of paper: Meet Me Half Way when the paper tray is half empty, Is That All There Is when it runs out, and so on.
  • And of course, be able to set your own personal MP3 for your printouts so you can hear when they are done.

Why hasn't anyone thought of this before?
