Tuesday, December 03, 2013

Really Simple Syndication On-the-Go

With the number of cellphones and tablets increasing by the day in every country, a major shift in information access is under way.

The Internet and its predecessors, BITNET, ARPANET, EARN and list servers, were initially developed for sharing information and searching for reference documents. The researchers at UCLA and SRI, the first to connect two networks with the aim of establishing a global interconnected network, could never have foreseen Project Glass, Skype or YouTube. The phone-in-a-watch was known only to Dick Tracy fans, who would themselves have chuckled at the notion of such gadgetry ever becoming reality.

Yet here we are. We have all this, and much more still lies behind locked doors. Innovative products take up to a couple of years of design and testing under near-draconian secrecy before hitting the market. This truly is the Information Age.

On the other hand, the art of reading is fast declining. Under the deluge of information, no one has time to do more than skim a précis as more, more and yet more new information keeps coming in. It is truly a wonder that the whole Internet and all its users don’t disappear in a puff of smoke.

With the large-scale proliferation of mobile devices – smart phones, tablets and such – a number of developed countries are facing a strange shortage: the ether is filling up. That is to say, the frequency spectrum used by transmission technologies such as GPRS and LTE is becoming saturated. But demand and the user base keep growing.

The major shift hinted at in the opening of this discussion is that the focus is no longer on establishing a presence on the Web – that has already been done. Instead, the focus is now on getting information across. It has to be short and to the point, and it has to be delivered quickly, since on mobile devices users do not open multiple tabs or have multiple monitors.

This led to web sites that aggregated content, such as Yahoo and Google News, and ultimately to aggregation software. Whichever such application you look at, each has one aim: get the information to the user.

The original idea of article exchange between websites using a standard XML syntax was developed in 1999 by Netscape, but it wasn’t until 2005 that it garnered major support from the likes of Microsoft and Opera.

Rich Site Summary, also known as Really Simple Syndication or, simply, RSS, quickly became the de facto standard, mainly due to its openness and ease of implementation. The advantage of syndicating news and articles via RSS quickly became visible to publishers, and the RSS logo became one of the first icons common to most such web sites.

The screen sizes of most current models of smart phones and tablets make their owners the perfect target audience. Syndication services and programs pull in feeds from around the world and summarize them into easy-to-read packages. Two of the most popular programs are Flipboard and Google Currents. Both offer similar facilities, allowing users to quickly skip and skim through items at speeds that regular desktop computer users would never have thought they needed.

There are two major advantages to generating RSS feeds from sites that deal in articles, such as news sites and magazines. Firstly, producing an RSS version requires an addition to the web site’s programming, not a change. Hence, RSS generation can be added without affecting the site’s existing functionality.

Secondly, the standard markup in RSS does not deal with formatting, allowing the syndication service or application to supply its own formatting. This enables the same feed to be formatted for a variety of devices, particularly smart phones where the screen size is limited.
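To make that separation concrete, here is a sketch of what such a bare-bones feed looks like when generated programmatically. This is a minimal illustration only; the channel and item values below are made up and not tied to any particular site.

```python
import xml.etree.ElementTree as ET

# Build a minimal RSS 2.0 document: structure only, no presentation markup.
rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "Example News"          # hypothetical
ET.SubElement(channel, "link").text = "http://example.com/"    # hypothetical
ET.SubElement(channel, "description").text = "Latest headlines"

item = ET.SubElement(channel, "item")
ET.SubElement(item, "title").text = "A sample headline"
ET.SubElement(item, "link").text = "http://example.com/article-1"
ET.SubElement(item, "description").text = "A short summary of the article."

feed = ET.tostring(rss, encoding="unicode")
print(feed)
```

Notice that the feed carries only titles, links and plain-text descriptions; how they are rendered is left entirely to the reader application.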

Further options allow syndication programs to store and cache the feed for offline reading for those places without Internet access.
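A minimal sketch of how such offline caching might work: try the network first and fall back to the last saved copy when there is no connection. The `fetch_feed` helper name is my own invention for illustration, not part of any syndication program.

```python
import os
import urllib.request

def fetch_feed(url, cache_path):
    """Return the feed at url, falling back to the cached copy when offline."""
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            data = resp.read()
        with open(cache_path, "wb") as f:
            f.write(data)  # refresh the cache for next time
        return data
    except OSError:  # no network, DNS failure, timeout, ...
        if os.path.exists(cache_path):
            with open(cache_path, "rb") as f:
                return f.read()
        raise
```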

With Internet access now available even in airplanes and hospitals, one wonders how long the caching facility will still be needed.

Friday, October 04, 2013

The Death of the Music Publishing Industry

And the Minor Deaths of CDs

“Heading West” was a popular term within the U.S. long before it became economically possible, then an option and ultimately a local fad. The West was where anyone with empty pockets but a heart full of zest could make his fortune, be it in the oil fields or in a small town at the edge of a desert. One such town was full of dust and promises, heart-breaks and dreams. It was called, of all unlikely names, Hollywood. Starting off as a privately held estate, it took nearly half a century to become a proper township. Aside from the picturesque countryside, the reason major motion picture companies established themselves there was quite accidental; they just wanted to move as far away as possible from Edison’s company and its lawyers. Thus the industry that made dreams come true was born. Each topic was fresh, each face was new. The celluloid they produced was shipped across oceans and nations. Careers and legends rose and fell each day.

The biggest fall came with the advent of the “talkies”. At first, motion pictures recorded with sound were dismissed by most as naught but a passing fad. What history did with these scoffers is well detailed in The Artist (2011).

The name Eastman Kodak was synonymous with film, be it for recording Westerns or for X-rays. Kodak held the patents behind much of the film industry and its equipment. Cutting edge or bleeding edge, if it had anything to do with filming, Kodak had invented it. With camera equipment at its height, Kodak invented a gadget for the “non-serious hobbyist”: the portable digital camera. It was taken as a fad yet again, expected to pass once people got over its novelty value and returned to “serious” cameras. The giant that ruled the business world of photography bet its future the wrong way and ended up filing for bankruptcy protection in 2012.

Such is history. Each new invention raises a wake that supports a million jobs and drowns a million others.
It is now the turn of the Compact Disc to join the museum shelves alongside phonographs, 8-track tapes and cassettes. The curtain is coming down not just on CDs but on publishing houses, distribution companies and so on down the line.

The why of it is no mystery. The CD offered a medium that was smaller, could store more information and reproduced music at higher quality than anything before it. Being merely a change of recording medium, it was taken in stride by music recording companies as well as equipment manufacturers, who simply phased out, or dumped, phonograph records and cassette tapes.

Their business, on the whole, remained the same: milking the success of music artists and performers worldwide. The recording companies, not the artists, profited from every sale. A complete book could be written (and many have been) on the one-sided economics of the music industry. Singer after singer worked unceasingly to produce enough material to fill a CD before any money could be made. And then all the money went into the coffers of the companies, while the singers got some fame and more bills to pay.

In this totally unfair world of music – which was, and is, exactly as fair as the rest of life – a singer could not sell even the best of songs until he or she could supply enough new material to fill a CD, while in the meantime the song was blared out from radio stations and televisions.

Suddenly, people noticed that the winds of change had blown yet again while they hadn’t been paying attention, and that they had pockets full of digital audio recording and playback equipment to which music could be downloaded at the press of a button. “Where did this come from?”, they wondered, then went on to hook these matchbook-sized players to their computers to download the latest songs, audio-books, cooking recipes and so on. If you stood near any bourse, you could hear the stock of recording companies crashing.

Each singer, each cook, each story-teller now has the option of distributing individual pieces of content directly. They can completely cut out the old middlemen, admittedly in favor of new ones, but with the added benefits of near-zero production costs and worldwide distribution.

The audience, or target market, is near incalculable, as the numbers keep increasing faster than any estimate can be collated. At a guesstimate, there will be 800 million Android phones and 300 million iPhones in use by the end of this year. In simple words, one person in seven alive has a smart phone. Add to that iPods and their ilk of MP3 players and you are looking at a lot of hardware.

Apple’s iTunes Store alone has sold 25 billion songs. JLo’s On The Floor had 1.4 million downloads from the iTunes Store alone within a couple of months of release, so she probably made enough money to buy a fully custom-designed airliner.

There is a huge Winchester hard drive on display at the Smithsonian. It could store an amazing 2K bytes of data. They’ll probably display the CD next to it.


Sunday, January 13, 2013

Traversing firewalls

Networking today is almost always a matter of running a cable from one end to the other, plugging it into routers and letting the DHCP server take care of everything. Microsoft has spoiled everyone by giving them a GUI to handle all the hard work. The nitty-gritty of such tasks doesn't affect Mac users because, firstly, has anyone ever heard of a Mac server, and, secondly, Mac users just don't care.

Just kidding, you Mac people, I know you take exception to such remarks. Now cool down, and go back to your iTunes.

Anyway, to get back to the networking post. Today's topic is how to connect across firewalls.
Let me lay out the scenario for you. There is the simple case where you have servers behind a firewall, like so:
In the above layout, all the servers and the accessing laptop run one or other version of Microsoft Windows. Why? Cuz I says so. Seriously, because that's the setup that I have. My laptop runs Windows 7 and all the servers run Windows Server 2003.

Also note that in the above diagram I have left out a lot of IP addresses. That is because they are not relevant for the purposes of today's post; you are only concerned with the ones mentioned.

To connect to any of the servers, you simply run the VPN setup wizard and let things take care of themselves. Then you can access the servers using Windows Remote Desktop (RDP), TeamViewer, a web browser or whatever. Pretty standard, and you probably do this on a regular basis. Nothing to it and hardly worth mentioning.

I mention this just to lay out the scenario for a more complicated setup.

In my case, a number of third parties allow me access to their servers, but only from my servers, over a site-to-site VPN tunnel. To make life more complicated, they have security in place (Cisco Access Control Lists) that only allows connections from selected IPs. That means that only my "Server 1" and "Server 2" can access those partner servers. Here is the scenario:

As you can see, there is a second firewall between my laptop and the target servers. All the partner servers run one or other version of Linux (RedHat, CentOS, etc.). To work on these Linux servers, I need an SSH session – and ftp, for that matter, but having the one automatically gives the other.

What I was doing previously was a real kludge: I would first use Windows Remote Desktop to get a session on my "Server 1" then from there use PuTTY and ftp to work on the Linux servers.

There are two major disadvantages to this way of working.

Firstly, it puts a lot of load on my communications line, as the RDP GUI session is replicated here on a pixel-by-pixel basis. Also, to transfer files in either direction, I had to ftp them first to "Server 1" and then on to the destination. A real time-waster.

The second, and worse, problem was that a number of my colleagues had to work on the Linux servers at the same time, and RDP limits the number of simultaneous connections a Windows Server allows. Hence, a lot of my time was spent listening to complaints from people who could not work because they could not find an available session.

SSH to the rescue!

A bit of Googling and posting to newsgroups introduced me to the tunneling capabilities of PuTTY. A special thanks to Angel Gonzalez of the openssh-unix-d mailing list.

PuTTY can set up, alongside an SSH session, a tunnel that forwards port requests through the SSH server. Now, this is the important bit, so be sure to read it carefully: you specify a local port, PuTTY accepts connections on that port and sends them through the SSH session, and the SSH server forwards them on to the final destination. A sort of double-forward configuration. This confused me at the start, as I was trying to do it in one go. The trick is to configure it in two steps.
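If it helps, the forwarding half of this is conceptually just a TCP relay: listen on a local port and copy bytes to and from a destination. The Python sketch below uses made-up helper names and plain sockets to show the idea; in PuTTY's real local forwarding, the listener-to-destination leg travels inside the encrypted SSH session.

```python
import socket
import threading

def pipe(src, dst):
    # Copy bytes one way until the source closes, then close the other side.
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            dst.sendall(data)
    finally:
        dst.close()

def forward_port(local_port, dest_host, dest_port):
    """Relay one connection from 127.0.0.1:local_port to dest_host:dest_port.

    This is the plain-TCP version of what an SSH local forward does; the
    real thing carries the second leg inside the SSH session.
    """
    listener = socket.socket()
    listener.bind(("127.0.0.1", local_port))
    listener.listen(1)

    def serve():
        client, _ = listener.accept()
        remote = socket.create_connection((dest_host, dest_port))
        threading.Thread(target=pipe, args=(client, remote), daemon=True).start()
        pipe(remote, client)

    threading.Thread(target=serve, daemon=True).start()
    return listener  # caller can read the bound port via getsockname()
```

With the real tunnel in place, connecting to a port like 5001 on your own machine behaves as if you had connected straight to port 22 of the remote Linux server.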

First of all, I needed an SSH server on my W2K3 "Server 1". I got one from SourceForge. Setting it up is a snap: run the installer, then read the quick start guide. Believe me, you can't make it work without reading it. A good example is given here.

Test it by running PuTTY on your laptop and opening an SSH session to your server; in my case, to 192.168.0.1. One problem I had was that the cursor keys didn't work. Another was that if I ran any program that opens its own window, I couldn't see it. (Well, of course!) Not to worry; we don't really need to bother with either.

Time for the configuration.

What we need to do is tell PuTTY to open an SSH connection plus set up the port forwarding. To do this, follow these screen snapshots.

Configuring the SSH session:

The IP given here is that of "Server 1". Port 22 is the standard SSH port.

Configuring the tunnel / port forwarding:

You will need to add multiple entries here. The source port is any free port on your local computer. What this does is the double-forward: requests to port 5001 on the local computer are sent through the SSH session and forwarded on to port 22 of the destination. Add an entry for each of the remote servers. You don't specify the W2K3 server's IP here, as it is already given on the first screen. Now go back to the first screen and save your session.

The thing to remember is that the tunnels do not come up first; they are established only after the SSH session is.

Configure the remote session:
Open a fresh copy of PuTTY and use these values on the first screen only. Do not enter anything on any other screen.

Save this entry for "Remote 1". Make new fresh entries for other remote servers.

Connecting:
Now, connect the first session, the one you made for "Server 1". You will get a login prompt for the W2K3 server. Enter your user id and password and you will see a DOS prompt. Ignore it and minimize the window. What we have done is open an SSH session that we don't need, in order to have the tunnels running that we do need.

Finally, connect the SSH session for the second entry, the one with 127.0.0.1 as the host name, and you'll get a Linux login prompt. Tada!

Hopefully, from the above, you'll see that the same thing works for ftp as well.