What Is the Best Way to Store Digital Photos? The Options Explained

Gone are the days of taking rolls of film to the drugstore to get them developed, waiting a few days for your pictures to come back, and then painstakingly putting all of those prints into photo albums.

Digital cameras and smartphones give us pictures in an instant, but what do you do with all of those photos? How many pictures and videos do you have stored on your phone or memory card? Thousands? Tens of thousands?

While you’re probably not printing all of these photos, you do need to figure out a way to store them. From the cloud to a website to computer hardware, there are plenty of different options. Read on to learn more about the best way to store digital photos.

External Hard Drive

An external hard drive is just like the one built into your computer, except that it lives outside it: a separate device that you plug in through a USB port. It’s a great option if you have huge numbers of photos to store, especially ones with large file sizes.

You can set your computer to automatically back up all of your photos to your hard drive regularly and organize the hard drive with folders that you select. There are a lot of options out there depending on the size you need, so consider that before purchasing anything.

Backing up files to an external hard drive is typically quicker than uploading them to the cloud, and as an added bonus, doesn’t require wifi or an internet connection.
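
If you’re comfortable with a command line, this kind of automatic backup takes only a couple of lines. Here’s a minimal sketch, assuming a Linux or macOS machine with rsync installed and a drive mounted at the hypothetical path /mnt/backup:

    # Copy new and changed photos to the external drive.
    # The -a flag preserves timestamps; nothing is ever deleted.
    rsync -a ~/Pictures/ /mnt/backup/Pictures/

    # To run the same command nightly at 2 a.m., add this line via crontab -e:
    0 2 * * * rsync -a ~/Pictures/ /mnt/backup/Pictures/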

Cloud Storage

Cloud storage is one of the most popular ways to store your pictures. By uploading your photos to the cloud, you get them off your phone, tablet, or computer and into storage that you can access whenever you want.

The problem with cloud storage is that the companies that host these services aren’t guaranteed to stick around. If you only have your photos in the cloud and the company folds, you may lose access to them or have to quickly transfer them to another storage service.

Most of the free sites also put limits on how many photos you can store. Once you hit that maximum, you will need to pay for a subscription. Some of the best options include Flickr, Amazon Photos, Google Photos, SmugMug, and, for Apple users, iCloud.

Like the other storage solutions, cloud storage has pros and cons to consider. You don’t have to worry about an extra device or lug around an external hard drive when you are uploading to the cloud, but you do need a wifi connection. Most sites require this and won’t upload files over a cellular network.

If you’re in an area with weak or no wifi, you will have to wait until you have internet access to upload. Similarly, if you want to access photos from the cloud, you’ll also need internet access. Note that some services only accept certain formats, so you may need to convert files (a PDF scan, say) to JPEG before uploading.

Portable Storage Device

If you don’t have a ton of files to store, or need them to be portable, a storage device such as a thumb drive or USB flash drive could work. You can also store photos on your camera’s memory cards, but these are limited to a certain number of files.

A flash drive gives you the ability to plug it into any device with a USB port to pull up your pictures. You don’t need an internet connection, and you can set your computer to automatically back up your photos to the device any time you upload them.

One major plus to these devices is that they are incredibly cheap. You can find ones that hold a few gigabytes for just a few dollars. You probably have a flash drive lying around your house right now.

Of course, their price point is also a con, as you get what you pay for. They are easy to lose, may simply stop working, or could become damaged to the point that you can’t access the files. Recovering files from a broken thumb drive can cost a pretty penny, negating all of the money you saved by buying an inexpensive device.

Make sure it’s password protected as well; if anyone finds it, they can access your photos.

Network-Attached Storage (NAS)

A network-attached storage (NAS) box is like a small computer with its own operating system, memory, and processor. You can store your files on your NAS and access them over your network or, with the right setup, the internet. Typically, you buy it without storage installed, choose what size of hard drive you want, buy it, and install it. If you run out of space, you can replace the hard drive with a larger one.
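
Once the drive is in, most NAS boxes expose their storage as an ordinary network share. As a rough sketch of what accessing one looks like from a Linux machine (the NAS hostname, share name, and username here are hypothetical, and you’ll need cifs-utils installed):

    # Mount the NAS share so it behaves like a local folder.
    sudo mkdir -p /mnt/nas-photos
    sudo mount -t cifs //nas.local/photos /mnt/nas-photos -o username=me

    # Anything copied into /mnt/nas-photos now lives on the NAS.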

How Do I Choose?

Instead of choosing just one way to store your photos, you should choose multiple solutions so you have plenty of backups. Most experts recommend having at least three different copies of your photos saved so you don’t risk losing access to them.

The 3-2-1 rule is an easy-to-remember rule of thumb: 3 copies of your data/photos, on 2 different devices (such as an external hard drive or flash drive), and 1 copy in another location (think the cloud, OneDrive, etc.). This ensures that you have many different ways of accessing your photos even if one of them fails.
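
In practice, the 3-2-1 rule can be as simple as two scheduled commands. Here’s a minimal sketch, assuming rsync and rclone are installed, an external drive at the hypothetical /mnt/backup, and a cloud remote named “cloud” already configured with rclone config:

    # Copy 1 is the original library in ~/Pictures.
    # Copy 2: a second device - rsync to the external drive.
    rsync -a ~/Pictures/ /mnt/backup/Pictures/

    # Copy 3: another location - rclone to a cloud remote.
    # 'copy' only uploads; it never deletes from the destination.
    rclone copy ~/Pictures cloud:photo-backup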

The Best Way to Store Digital Photos Includes Multiple Devices

The best way to store digital photos so you never have to worry about losing them or not being able to access them is to use multiple devices. Using the 3-2-1 rule, take care in backing up your photos so you can enjoy them for years to come.

If you found these tips useful, be sure to check out some of our other articles.


How Blockchain is Transforming Healthcare

Blockchain technology is expected to grow steadily through 2025.

Blockchain technology has gained widespread use in the healthcare industry, where it is proving highly beneficial.

In 2012, Estonia began using blockchain to secure healthcare data and process transactions. Now 95% of health information is ledger-based and 99% of all prescription information is digital. The country’s complete healthcare billing is handled on a blockchain.

Blockchain to secure patient data

Healthcare data breaches are common. Between 2009 and 2017, more than 176 million patient records were exposed in data breaches. In these breaches, hackers stole credit card and banking information, as well as health and genomic testing records.

The usefulness of blockchain technology lies in two key characteristics: immutability and transparency. With these, patient data can be kept secure in an incorruptible, decentralized, and transparent log. Further, blockchain can be private: it conceals the identity of any individual behind complex, secure codes that protect the sensitivity of medical data. The decentralized nature of the technology allows patients, doctors, and healthcare providers to access the same information swiftly and safely.
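
To see why such a ledger is hard to tamper with, consider a toy hash chain, where each record’s fingerprint depends on everything before it. This is only an illustration (the records are made up, and real blockchains add consensus and digital signatures on top):

    # Each hash mixes in the previous one, so editing any earlier
    # record silently invalidates every hash that follows it.
    prev="genesis"
    for record in "patient A: blood test" "patient A: prescription"; do
        prev=$(printf '%s|%s' "$prev" "$record" | sha256sum | cut -d' ' -f1)
        echo "$record -> $prev"
    done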

Factom, BurstIQ, and Medicalchain are a few healthcare start-ups tackling this challenge.

Blockchain to streamline medical records

Medical information is sensitive, and communicating it has to be precise. However, miscommunication between healthcare professionals is a common problem, costing the healthcare industry nearly $11 billion a year.

Further, obtaining a patient’s medical records is a time-consuming process that exhausts staff resources and delays patient care. Blockchain-based medical records overcome this challenge and offer a solution to this chronic problem.

The decentralized nature of the technology creates one ecosystem of patient data that can be quickly and efficiently referenced by doctors, hospitals, pharmacists, and anyone else involved in treatment.

Robomed and Patientory are a few healthcare start-ups building this kind of patient-data ecosystem for hospitals.

Blockchain for medical supply chain and drug traceability

There’s a lack of information about medicines. People buying them often don’t know whether they came from a legitimate source, and pharmaceutical companies have little way of knowing whether substandard medicines are being sold under their brand names. For this reason, the supply chain needs to be constantly tracked.

Pharmaceutical supply chain management can benefit from blockchain technology. Given the decentralized nature of the technology, full transparency can be achieved within the supply chain.

Chronicled and Block Pharma are start-ups that help pharma companies trace their supply chains.

Blockchain for Genomics

The cost of sequencing a human genome was $1 billion in 2001; now it costs around $1,000. The use of genomics to improve human health is finally becoming a reality. Start-ups like 23andMe and Ancestry.com are making DNA tests easy, bringing clues to our health and our past into millions of homes.

Blockchain fits this industry well, as it can safely house billions of genetic data points. It has also enabled marketplaces where people can sell their encrypted genetic information, creating a wider database and helping scientists access valuable data faster than ever before.

Research and development are two key areas for the progress of the healthcare industry. Faster research drives new vaccines and medical treatments, while faster logistics ensure speedy provision of healthcare services. In both respects, the industry has been slow. Blockchain technology presents an opportunity to transform both, and healthcare start-ups are already speeding up the process.


Web.com Reviews Examines the Necessary Features Every Website Must Have

Introduction

Making a new website can be an overwhelming experience. According to Web.com Reviews, whether you are making a trendy blog, an opinionated journal, or an e-commerce website, the options for customization are endless. If you are a first-timer, it is natural that things may seem confusing. So, here is a list of absolutely essential features for your website.

The Features

  1. Description of your business should be crisp and clear – Whenever a new user lands on your website, it is essential that they get a clear idea of what your website is all about. Apart from conveying that information, you also need to make a solid first impression. Before you write the description and design its presentation, think about a few important questions:
  • Who are you?
  • Why does your website exist?
  • What do you want to achieve with this website?

If your description has satisfactory and crisp answers to these questions you are halfway there. It is also a good idea to include a hero message or mission statement that quickly summarizes the purpose of your business.

  2. Treat your URL as your logo – Whether you are a freelancer who wants to get their work onto the internet quickly or an exciting new business, you may be tempted to just get your website up and running. However, make sure to do your research before deciding on a URL. Make sure that it is simple enough to remember, but distinctive enough to create your own unique brand identity. It should also be easy to type, so try to avoid underscores, special characters, or any complicated words. Moreover, always stick with .com; search engine algorithms and users treat such websites as more credible.

  3. Contact Information – It is an obvious feature, but make sure to include your address, business name, and contact number in your website’s footer. Users who want to do business with you like to scroll down to the bottom for this information; that’s the industry-standard location. Apart from the block number, street, and pin code, you can also add your address to online maps so that users browsing your website on their phones can open up your business location directly in Google Maps or Apple Maps.

  4. Calls to Action – A website is good for an online presence, and traffic to that website means you are getting the exposure that you need. However, converting those visitors into leads and actual sales figures is what makes a website a true success. After a user lands on your website, you need to guide them to take the action that you want. It may be a pop-up for subscribing to a newsletter, submitting a contact form, or creating an account. You need to craft those calls to action carefully so that your users are inclined to participate instead of getting annoyed and leaving your site.

Conclusion

Web.com Reviews believes that no website can do without these features, and you should incorporate them into your website design before you take it online.


WHICH KEY LEGACY SENSORS WILL WORK FOR YOU? EVALUATE YOUR IOT REQUIREMENT

There are several real-world cases that make end-to-end IoT applications critical for business. Communication carriers are already seeing the growth of legacy applications that interlink several platforms, components, and devices. They build successful operations with business apps, SIMs, gateways, and cloud services. Sensors hold the key to their agility and functionality. However, an enterprise has to choose the ‘key legacy sensors’ ideal for its long-term functions, and that is possible only by knowing its specific IoT requirements.

This blog aims to offer information on the types of sensors used in IoT applications. With a few examples, you may be able to recognize and evaluate which could meet your needs.

Audit & Evaluation

Your current business operations provide clues to your interconnectivity prerequisites. The module, infrastructure, and network architecture may also demand an upgrade. Before selecting any technical support to build and implement it, a quick audit is essential; this evaluation is the primary step for the transition. Would a battery-powered device be better, given its low consumption rate? What if you need to track mobile assets in remote areas? Connectivity is critical to such activities. It holds significance for telecom companies with extensive network coverage areas, and it is perceived differently in smart-city solutions that map public transport during peak hours. Do normal operations require high-quality bandwidth? This could be another determining factor in the assessment process. Your business sector and competition also shape the right solution. And before we move on to the key sensors, it is important to think about long-term design for future-proofing: with disruptive trends constantly popping up, adopting new apps without planning brings complexity and heavy costs.

IoT Sensors adoption

Sensors have been in the market for a long time, but with the advent of IoT everything has changed. Automation in businesses and industries produces data and intelligence that improve productivity. With connected devices, sharing crucial data is an integral part of the ecosystem. With a combination of key sensors and communication channels, functions have become more effective and also ‘smarter.’ The best example is that of a self-driving car, which ‘senses’ and picks up information on the way, uploads it, and provides a smooth ride. Of course, there are other real-world examples where IoT applications are running operations effectively. As data is processed, it adds value to these functions, giving an advantage to adopting the technology.

Key sensors in the market

The use of sensors is vital for various business operations. The benefits include:

  1. Warnings when anything is about to go wrong, allowing managers to take charge and handle problems before everything comes crashing down.
  2. Maintaining or replacing equipment on time enables continuous workflow management.
  3. Evidence-based decisions can be taken to contain any potential problem.

There are sensors of many types – temperature, humidity, infrared, optical, and more – fulfilling various needs across diverse industries.

As you can see, IoT application development is not just about connecting devices; it is much more than that, and it depends on choosing the right key legacy sensors. This is why auditing and evaluating your needs is important.

What should you consider before choosing the IoT App?

There are three things that matter most for any app to be successful: scalability to meet new trends, monitoring for effective delivery, and continuous performance. A successful implementation requires strong measures at the planning stage.

Key legacy sensors come in various shapes and sizes. They are valuable in creating the perfect gateway to remain connected. A well-designed sensor uses integrated, dedicated circuit blocks for better security. An ideal sensor is fitted with an advanced tamper detector and a math accelerator. A stronger version will have functions that cannot be cloned or attacked; if there is any criminal activity, it shuts down all operations and provides a warning sign to users. With careful design, a sensor does its job better.

Clear Functions

An app works only if the intended object is able to provide crucial data to users. For example, a temperature sensor should be able to provide readings for evaluation. A humidity sensor predicts and provides reports for HVAC systems or weather. Infrared sensors measure heat or detect IR emissions; their sensitivity also makes them a huge support in the healthcare industry for monitoring blood flow and pressure, and in the biomedical field they are used to analyze breath and observe heart rate. And did you know that smartphones have strong optical sensors? They are perfect for automobile drivers who need to focus on the road while driving.

Reading the above information on how IoT applications can change the way you stay interconnected with gadgets and your workforce should encourage you to assess your prerequisites in a structured manner.


Smart TV Buying Guide: Know About the Features and Benefits

If you are looking for a new television, chances are that you’ve already made up your mind about buying a smart TV. Almost all TV brands today are focusing on offering smart TVs at affordable prices. While brands like LG, Sony, and Samsung offer premium models that come with a hefty price tag, other brands like Mi, VU, and now, even OnePlus are offering smart TVs that are conveniently priced.

Smart TVs are slowly but surely becoming commonplace in most Indian households. Considering that the way we consume content has changed significantly over the past decade, smart TVs offer the most convenient way of watching our favorite movies and TV shows.

Whether you want to watch the latest movies on OTT platforms such as Netflix or Prime Video, or cast your favorite games onto a large TV screen, smart TVs let us enjoy visual and audio content like never before.

That said, if you are still going through your checklist about the features you want in your smart TV, here are a few must-have features, along with their benefits.

1. Work on your Smart TV

What makes a smart TV inherently smart is its ability to connect to the internet, which opens up endless possibilities. One of the biggest advantages of owning a smart TV is the ability to surf the web, and even to work on your smart TV instead of your smartphone or laptop! For example, one of the best TVs in India, the Samsung 32-inch (80 cm) Smart HD TV (R4500), gives you the option to turn your smart TV into your personal computer, where you can work from the cloud, mirror your laptop, or remotely access your office computer.

2. Internet Connectivity

Before smart TVs came along, our TV cabinets consisted of a set-top box, streaming devices, Blu-ray players, DVD players, and a host of HDMI cords plugged into our TV sets. Smart TVs have made those devices obsolete, decluttering our entertainment centres. Thanks to smart TVs, we no longer require streaming devices, and can stream content from various OTT platforms, listen to music, or buy or rent newly released movies from apps. Thus, smart TVs also act as one-stop home entertainment systems.

3. Affordability

A couple of years ago, when smart TVs weren’t as popular, brands like LG, Sony and Samsung offered premium models that were too expensive for the average consumer. Today, however, brands like Mi, VU and OnePlus are offering smart TV models at attractive prices. Some of the best TVs under 15,000 INR belong to these brands, such as the Mi 4A Pro HD Ready LED Smart TV, and even Samsung now offers 32-inch HD Ready smart TVs for less than Rs. 15,000! Thus, smart TVs today are more affordable, and also come with top-of-the-line features and the latest technology, enabling users to stream videos instantly or play mobile games on large screens.

4. Chromecast and Mirroring

Most of us continue to rely on smartphones and laptops to stream our favourite shows, movies, and matches. However, with a smart TV, you can cast your smartphone or tablet content onto your smart TV’s screen.

For instance, the Mi 32-inch 4A Pro HD Ready Smart TV comes with built-in Chromecast, and you can use Google’s Home app to mirror your smartphone’s screen. It is one of the best 32-inch smart TVs at a very affordable price. You can watch your favourite content from various apps like Netflix and Hotstar. This ensures an enhanced viewing and audio experience, while you can also enjoy mobile games on a large screen or talk to your loved ones via video call through your smart TV.

5. Voice-Assisted Services

AI systems now drive our TV’s software platform, which makes it possible for us to communicate with our smart TVs. Voice-assisted services enable us to give commands to our smart TVs through Google Assistant or Alexa, creating an ecosystem of interconnected AI devices.


What Are the Advantages of Telemedicine Software?  

There are many barriers to proper medical care. Most people do not know how to stay healthy or lead healthy lifestyles; indeed, most lack some degree of health literacy. It won’t come as a surprise that a number of people have no clue about the various aspects covered by the definition of health literacy.

Besides, many patients face constraints such as long distances to medical centers, a lack of reliable transportation, and a lack of quality healthcare providers. Telemedicine tools can overcome such barriers, letting more patients access medical attention from the comfort of their homes.

Telemedicine software has made it easy for medical caregivers to achieve things that would not be possible in a traditional medical care setting. This article focuses on the various benefits of using telemedicine software.

Telemedicine increases access to medical care

Some of the limits to medical care are the distance and travel time that separate patients and doctors. Many patients find it hard to reach medical centers because of poor transport options or the distances involved, and they may lack the finances to cover the trip. Telemedicine software overcomes these geographical barriers; areas that experience clinician shortages, and rural areas in general, can benefit greatly from it.

It improves the quality of medical care services

Telemedicine has already shown a positive impact in the medical realm, with improved quality of medical care delivery in most areas. According to the stats, areas that have embraced telemedicine have seen 38.5% fewer hospital admissions. Besides, patients are now more engaged in their healthcare, and 31% fewer hospital readmissions have been observed.

Telemedicine cuts down on healthcare costs

First, telemedicine has helped patients stay out of hospitals while still undergoing treatment. No one has to be transported to another location while under treatment, and anyone can use telemedicine to get the medical attention they need. The expenses incurred in caring for patients in hospital have also fallen. Thus, telemedicine cuts down most costs linked to medical care services.

Telemedicine improves traditional face-to-face medicine

One of the things that leads to high-quality patient care is a strong relationship between doctor and patient; this kind of relationship also reduces the costs of finding and using medical care services. Telemedicine should not replace the traditional medical care system but support it. With this technology, doctors can keep offering in-person care and support while also having the flexibility and convenience of helping patients remotely with check-ups, education, follow-up visits, and more.

It boosts provider satisfaction

It is true that most medical caregivers face challenges in meeting their patients, diagnosing them, and treating them, not to mention the follow-up process. Many medical care providers also find it hard to balance work and family life. With telemedicine, it is now possible to achieve a lot in the medical field and still have enough time for family or personal life.

Conclusion

Telemedicine has many benefits: increased access to medical care, improved quality of medical care services, reduced costs, and support for traditional face-to-face medical service delivery. Thus, medical practitioners would do well to consider using telemedicine software in their facilities.


Top 5 Mobile Application Development Companies For Customers and Enterprises

The growing demand for mobile devices has pushed customers and enterprises worldwide to seek app solutions that enhance their revenues. However, picking the best from the rest of the pack can prove hectic. So here is a list of the top 5 companies that have phenomenally helped countless startups and giants.

Mobile conquers the world! Of course, no one can deny this fact. Recent surveys and reports point to the evolution of mobiles and everything related to these compact handheld devices. As mobiles and their accessories have become an integral part of most of our lives, the surge in mobile application development doesn’t surprise many! Mobile apps are making giant strides into almost all industry sectors, and they have drawn solid responses from consumers as well as enterprises. App development has changed the way we live, as apps offer tremendous convenience and sophistication while letting us access data on the go. As a result, to meet the growing demand, there are today countless iPhone and Android app development companies competing to draw the eyeballs of customers worldwide. Here, I’ve hand-picked the top 5 mobile application development companies that deliver high-quality app solutions to customers worldwide.

Kony.com:

Kony, with more than 70 Fortune-listed clients, is one of the top application platform and pre-built app providers. Their specialty is building apps that work smoothly across multiple channels and major OSs. They have a very good clientele and impressive apps that will please most users.

Fueled.com:

When it comes to realizing users’ desires and meeting their expectations, Fueled.com is not far behind! Their forward-looking thinking and out-of-the-box design have helped many customers and enterprises generate massive revenues. If developing a mobile app quickly and with absolute quality is your agenda, this company can deliver the goods.

Intellectsoft.net:

Since its inception in 2007, with a pliable and accessible app development process, Intellectsoft has shown the ability to revolutionize the mobile world with top-quality apps. Their value-driven business model and client commitment have also led enterprises and users around the world to trust this company for end-to-end customer solutions.

Contus.com:

Contus is a fast-growing mobility, web and cloud solution provider for customers worldwide. Its popular 4D application development process for iOS and Android customers and enterprises is a huge bonus, with the ability to transform revenues from low to high with ease. They have assisted countless startups and corporate giants with idea-led apps for better business performance.

Mubaloo.com:

Mubaloo is a UK-based enterprise and consumer app development company with a global clientele. With their mobile-strategy-expert app developers, Mubaloo has built some incredible apps that have helped improve the revenues of a lot of enterprises. The company has won a National Business Award for its excellent app development ability.


A Review of PowerDVD Linux – expensive, but just works

PowerDVD Linux has been around for years. Originally sold only to embedded Linux developers, the software made its way onto the desktop by being included by hardware makers on Dell’s Ubuntu laptops and, more recently, Asus Eee PCs.

The software is now finally available to the general Ubuntu-using public via The Ubuntu Store.

As a rule, we generally don’t like a lot of proprietary software – not for any ethical reason, but because a lot of proprietary Linux apps are crap, with weird installers, no menu entries, EULAs in pop up terminals, and unnecessary requests to reboot.

On the other hand, the free options have their problems. Totem requires some setup (although it works out of the box if you use the Ubuntu derivative Linux Mint) and at its best seems to use incorrect colors, giving a slightly yellowish tinge to the picture. MPlayer has a horrible UI, and frequently has issues with menus, chapters and subtitles.

We took a gamble and purchased PowerDVD Linux. It’s expensive – $50 US (24 pounds UK) – and we were quite prepared to trash the software as publicly as we could if it failed to live up to expectations – which we expected it to do.

We were pleasantly surprised.

Installation

PowerDVD Linux comes properly packaged for Ubuntu. Click the deb, install it, and click Applications → Sound and Video → PowerDVD Linux to launch the app.

Desktop Integration

The app installs itself as the default player for film DVDs. This means that when you pop in a DVD movie, Ubuntu will ask if you’d like to play it with PowerDVD, and when you browse a DVD that has a movie on it, Nautilus will ask if you’d like to play the film.

The app will also automatically start if you insert a video DVD, resuming the film where you left off.

PowerDVD works with Compiz, but will switch to Metacity when it starts and resume Compiz when it exits. The switch is seamless and you don’t have to do anything.

Aside from the on-screen UI shown above, hitting Esc during a film, or starting the app without a disc in the drive, will bring up the regular UI shown below. The regular UI is simple, and contains various options for tweaking color correction and switching between stereo and Dolby surround. Thankfully, it avoids the ‘chromed-up-car-stereo-ala-WinAmp-1996′ look that a lot of DVD players have.

Playing Movies

We played a wide variety of films and found subtitles, languages, and menus worked fine.

Colors are great out of the box. There’s a color menu for those that would like to tweak, but we didn’t need to.

Rewind and fast forward work well, with speeds from 2x to 16x. We’d like to have seen more frames shown when playing at 16x, however. Also note that since PowerDVD is a licensed DVD Consortium product, it refuses to skip some advertisements and copyright notices if the disc maker has flagged them as unskippable. Most discs aren’t quite so annoying, however.

Reliability

In normal usage, the app was fine. Our torture tests had the following results:

Shock: after shaking the laptop violently, our disc resumed playing.
Mildly scratched disc: not noticeable.
Very badly scratched disc: stopped playing. We had to hit Esc to return to the main menu, and restart the disc from there.

PowerDVD Linux never crashed during any of these tests.

Overall

As much as we hate to say this, PowerDVD is damn nice. There was no setup required, and it was reliable as all hell. While some distros (eg, Linux Mint) come with Open Source DVD players out of the box, PowerDVD’s colors simply looked a lot better. None of the weird stuff – automatic playing, subtitles, menuing, slow motion – was a problem.

The app simply got out of the way and let us enjoy our film.

At $50 US, it’s damn expensive, but we don’t regret our purchase at all.


A light hearted look at the great format wars of our time

Another great format war ended this week, with Blu-ray claiming 70% of released films and, according to the Financial Times, HD-DVD left with one major studio. HD-DVD owners are either mildly put out, or happy they can now pick up HD movies for a few dollars as they’re tossed out by major retailers.

We’ve been here before: whether RealPlayer versus Windows Media, BSD versus Linux, or boxers versus briefs. Let’s take a look back at some of the great format wars of our time.

We should first clarify what we mean by losing. Losing a war doesn’t mean the technologies or formats are bad (though sometimes they are). It doesn’t mean the technologies aren’t in use either. It just means they’re no longer the default solution for their area. Every article on the death of ColdFusion was filled with angry developers pointing to a new release of ColdFusion as proof of how alive the language was. But ColdFusion developers are scarce, it’s rarely chosen for new projects, and it’s no longer viewed as a mainstream web development language like PHP. ColdFusion lost the technology war. Deal with it.

Metric versus Imperial

The war to end all wars. Until the 1960s, imperial rocked the world, and students spent a great deal of time learning to convert fonoobularsers into schmeekulars. Since then, however, everywhere but Myanmar, Libya and, er, the USA has switched to metric, with unfortunate consequences for polar landers everywhere.
Winner: Metric
Loser: Polar landers

Word and WordPerfect
PCs were DOS, GUIs were for children, and WordPerfect was the word processor. But then Windows came out, and thanks to a sweet combo deal for PC makers shipping MS-DOS, Windows 3 came with most new PCs. Sure, you could run WordPerfect for DOS, but Microsoft had a pretty damn good version of Word for Windows that could read WordPerfect files. Alas for WordPerfect, patenting file formats hadn’t been invented back then, so WordPerfect had to stand on its own two rather smelly feet. By the time a decent GUI WordPerfect came out, nobody cared. A few years later, Microsoft bundled WinWord with its other apps as Microsoft Office.
Winner: Word

Netscape versus Internet Explorer

In the mid ’90s, Netscape was the internet for a lot of people. Microsoft were still touting their (then proprietary, not internet-based) MSN as the sensible alternative to the internet.

But then something changed. Bill Gates sent a memo to everyone in Microsoft about a coming internet tidal wave. So in the late 90s and early noughties, Microsoft did everything they possibly could to better Netscape: making their browser free, gluing it into the Windows file manager and desktop, and pleasing developers by pushing ahead with standards support that ate Netscape’s for breakfast.

Meanwhile, Netscape did everything it could to annoy its users. Netscape put a ’shop’ button next to ’stop’, and while it had open-sourced its code to create the Mozilla project, it seemed to have missed the whole ‘release early, release often’ tenet of that movement – the company didn’t release a major new version of Netscape for five years after Netscape 4 shipped. Throughout the war, Netscape apparently forgot to add ‘compete’ to its to-do list. IE’s market share went from 20% in ’96 to 80% in 2000.
The winner: Internet Explorer.

RealPlayer and Windows Media Player

In the late 90s, a former Microsoft executive sees the internet coming, and starts creating a first-to-market streaming audio and video system that works on every platform, is free, and is included with the most popular web browser. The model is simple: charge companies for the authoring tools to create files, and provide a better version of the player to paying customers. Microsoft itself has no idea about the internet at the time and does not respond for years.

So why did RealPlayer lose? They did everything possible to annoy their customers. In 1999, the free player, hidden to this day in Real’s website, was pimped out to spyware companies like Comet Cursor. That same year the world realized the software sent a list of everything each user played back to RealNetworks. Meanwhile, the free player kept harassing you with popup advertisements while you were using other software. Muting RealPlayer stopped all other sound applications from playing (a problem that persists to this day). During installation, RealPlayer asked if it could bone your sister, and if you unticked the box, it did it anyway. OK, we made that last one up. We think.

Windows Media Player took advantage of Microsoft’s desktop monopoly to get onto everyone’s desktop, and stayed there because it didn’t suck.
Winner: Windows Media Player

DVD-RW and DVD+RW

Sure, there were vast technical differences, but nobody cared. In the early 2000s, most drives could burn either the DVD Forum-backed DVD-RW (supported by Apple and Pioneer) or DVD+RW, backed by HP, Dell and Microsoft. The DVD-RW camp cracked first, converting to hybrid DVD+/-RW drives a couple of years before everyone else did.
The winner: +RW. But the victory didn’t last long.

Internet Explorer versus Firefox

By 2003, Netscape was dead. Without any good competition (apart from the great-but-still-unpopular Opera), Internet Explorer hadn’t been updated in a couple of years, and Microsoft, having no need to attract developers, stopped caring about standards support. Sure, there was Mozilla, but it was an ugly, slow, bloated mess.

In 2004, Firefox was announced to the world with a two-page spread in the New York Times. Firefox didn’t look like just another web browser. It had a more minimal design: one toolbar, with only the basic navigation controls placed beside the location bar and, handily, tabs underneath. It was snappy and responsive, and the ability to block annoying flashing images and popup advertising came as standard. It also had great standards support.

In late 2006, five years after IE6, Internet Explorer 7 came out. It has a more minimal design: one toolbar, with only the basic navigation controls placed beside the location bar and, handily, tabs underneath. It’s snappy and responsive, and the ability to block annoying flashing images and popup advertising comes as standard. It also has improved standards support.

That wasn’t enough, though: from 2004 to the present, Internet Explorer’s market share has been in slow, steady decline. While Firefox may only have around 20% of computer users worldwide, it’s clear which browser is setting the agenda for web browser development.
Winner: Firefox

Handheld Devices: Windows Mobile versus Palm

After Apple’s Newton failed to take off, Palm was the first to popularize the PDA. Microsoft saw the opportunity, and came out with, er, Palm-Sized PCs, before getting their ass handed to them on a plate. Later rebranding to Windows CE and then Windows Mobile, Microsoft differentiated itself from Palm’s strictly-business devices by giving users color screens and multimedia support. The Windows Mobile applications didn’t share much code with their desktop counterparts – editing a document in Word Mobile still removes most of the formatting, and Pocket IE can’t show most websites properly – but the brand recognition was strong enough to sway users to Windows Mobile. Microsoft licensed their OS to all and sundry, and while HP came out with some great iPAQs, the cleverly named High Tech Computer (now just HTC) emerged from obscurity with a variety of innovative devices to become the most popular maker of PDAs. Palm, meanwhile, did…not much at all.
Winner: Windows Mobile

Web Development: Microsoft Stack vs LAMP

Once upon a time, there was ASP. It was web development for VB coders back when VB was the easy choice, and it merely required a license for a server platform with a spotty security record that needed to be restarted for every security update. But the market-leading web server was generally deployed on Unix platforms, which already shipped PHP and either MySQL or PostgreSQL, and had a better reputation for security and maintenance. Sure, ASP still exists, but it’s now a language for Microsoft-obsessed corporates, and virtually unheard of in the startup space. See for yourself what the current crop of web 2.0 sites use, then visit the Y Combinator news forums and realize nothing’s about to change.
Winner: LAMP

Open Source Desktops: GNOME and KDE

KDE was the older, more polished desktop. GNOME was a weird acronym whose main features were being brown, having a foot for an icon, and changing its internals every five seconds (Enlightenment, Sawfish, Metacity, a plastic banana). GNOME got some traction because Red Hat and Debian didn’t like KDE’s reliance on Qt, which would have required proprietary app developers to pay a licensing fee to Trolltech. This eventually changed, but meanwhile GNOME had started to improve – and become the basis for Evolution, one of the first mail clients on any platform to include Bayesian filtering and smart folders. The major Open Source web browsers, Mozilla and then Firefox, also used the GTK theme to varying levels.

A few years later, Ubuntu was created from Debian. Fedora was created from Red Hat. Then SuSE bought GNOME boosters Ximian, and switched from KDE to GNOME as default. Now the most popular Linux desktops – Ubuntu, Fedora and OpenSuSE – use GNOME by default. Meanwhile, KDE spends its days convincing people that, er, 4.0 doesn’t mean 4.
Winner: GNOME

Open Source Operating Systems: BSD vs Linux

BSD took Unix, a popular proprietary operating system requiring proprietary hardware, and made it available for free on x86 under an Open Source license. It existed before Linux did, and for the first few years after Linux came along it was technically superior in many ways, with securelevels, dependency-based packaging, fantastic firewalling, and good documentation, before these features (or their equivalents) existed for Linux.

The only thing it lacked was by design: BSD was licensed to allow anybody to do pretty much anything they want with it, including taking the source code and improving it without releasing the improvements back. Folks loved the license, and took advantage of it – BSD’s networking code and IP utilities ended up in Windows NT, other utilities ended up in NetApp Filers, and a bunch of embedded firewalls are based on the entire OS.

Today, lots of people use BSD code. They use it in proprietary applications, have no idea who created it, and have no idea that BSD exists as an Open Source general-purpose operating system. Meanwhile, the GPL license of Linux attracted more developers, creating better software, in turn attracting more developers. Linux now does everything that was once unique to BSD, and a lot more that BSD doesn’t.
Winner: BSD. Just kidding.

Windows Media Player and Flash Video

Windows Media Player trounced Real, and seemed unstoppable. Meanwhile Flash – which, as an essential plugin, also came with Windows, the growing-in-popularity Firefox, and OS X’s Safari – quietly added Sorenson Spark video support. Suddenly Flash wasn’t just a plugin for animation and sound, but a video platform.

The timing was great. The web video scene was starting to expand rapidly, now that users finally had the connections to handle it. Flash was easier to update than Windows Media Player, and the quality was great – to the point that Apple (who make the less popular QuickTime) sued Sorenson, the codec provider, claiming that Sorenson’s technologies were licensed exclusively to them. Subsequent updates added new and better codecs. The plugin updated itself, and after a couple of years Flash video support was everywhere.

In 2005, YouTube and Revver (and a plethora of wannabes) came along, basing their business models on the new format. In the last year, every major video site has been based on Flash Video – and none on Windows Media.

Winner: Flash

Disagree? Think we’ve missed a format war? Really, really, like Realplayer, and your sister can date who she wants? Comments below as usual.


What’s broken with Unix? How would you fix it? Part I

The question above is the only question on the Google Labs Aptitude Test that relates to an operating system. It’s a good question: it immediately gets rid of fanboys who can’t see anything wrong with any tool they love, and allows people who are passionate and knowledgeable enough about Unix to demonstrate they can either:

  • see Unix’s faults
  • tell Google they prefer Unix in uppercase, just like MULTICS was.

This is part one of a multipart series, published over the next two weeks. This week: text processing.

Part I: Text Processing

There are plenty of Unix gurus who make arguments about separating content from presentation when discussing structured formats (like TeX) over Microsoft Word. The same argument applies to the shell.

  • Commands don’t return structured information. They return text marked as either output or errors. Errors are accompanied by error codes to help determine the type of error. There is a vague convention a few tools follow to split output into warnings and information using [WW] and [II], but it’s not particularly popular. There is no standard field separator. To find particular data in your output, you’re forced to think about where that data sits in the text. For example, getting interface names and IP addresses out of ifconfig:

        ifconfig | egrep -o '^[a-z0-9]{1,12}|inet addr:[0-9.]+'
  • Each new tool has a separate config file format that users need to learn and software needs to parse. This places an additional burden on tools developers, who often do horrible things like storing configuration in their own files, causing data loss when files modified by other methods are overwritten. Red Hat did this a lot – system-config-named would overwrite BIND’s own configuration files (does anyone know if Red Hat still does this?). Yes, it was terrible, especially from a company that ships and is responsible for both the config tools and the app. But it’s quite hard to create a new parser for every new file format.
  • Adding information breaks parsers. For example, it’d be handy for ifconfig to show an ethernet card’s link status and speed, or the child cards used by bonded interfaces. Yet adding this information can break existing scripts that rely on the current presentation – so you have one tool to check the current IP address, the transfer and receive counters, etc., and another tool to check the link status and speed. It’s terribly inefficient.
  • Text editors don’t handle structured information. This is why people complain when a config file is in XML. Yes, it’s horrible to edit, but that’s because vi isn’t built for XML. vi has keyboard macros to delete characters, words, lines and paragraphs. You can’t delete a Samba share declaration, or an Apache virtual host. You can’t copy a share declaration to another share declaration. Or skip to the next tag-enclosed value. Instead, you must treat these items as characters, paragraphs, or strings, rather than the objects they are.

So how to fix this?

  • Allow users to fetch particular information from output and config files by specifying what they want, rather than its location. So:

        ifconfig | egrep -o '^[a-z0-9]{1,12}|inet addr:[0-9.]+'

    becomes:

        interfaces | filter name ipaddress

    (A sketch of how close modern tools already come to this follows this list.)
  • A smart shell can complete ‘name’ and ‘ipaddress’ after a few characters, since it knows all the available fields that ‘interfaces’ returns. This is similar to the object pipelining that Perl 6, PowerShell, and Hotwire allow.
  • Allow output to work as input, so I can save the current interface settings of a system to a config file, or send them to another system.
  • Allow editing tools to treat content as content – adding, copying, and deleting sections, variables and values, rather than treating them as a series of characters, lines or paragraphs. Sure, deleting paragraphs makes vi nice and fast. But not as fast as if you could delete rows, columns, and sections.
  • Take advantage of modern displays through user configurable stylesheets. Allow proper highlighting of errors and warnings, or have intermediate records shown in different colors, or have headings shown as headings. This is something Hotwire already does.
  • Allow better reporting (config to documentation) or building (turning documentation into config) by using XML as the base format. This doesn’t mean you ever have to see any actual XML – remember, editing tools should treat content as content; see GConf and DConf for examples. It should be easy to build a machine from a word processing document specifying the machine’s configuration, or to build a simple document from a host’s configuration next time someone asks for a report of some kind.
  • Mandatory documentation of all settings as part of the schema. This allows all kinds of exciting possibilities, such as contextual help – see a definition of what you’re editing. GConf and the upcoming DConf do this too.
  • A hierarchical structure, for simple organization and other exciting concepts like settings mounts, to allow easy sharing of configuration between machines – again, see GConf.
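
Modern Linux already offers a taste of this approach: iproute2 can emit JSON, and jq can then select fields by name rather than by position. Here’s a small sketch of the ‘say what you want’ idea – it assumes a recent iproute2 with JSON support and jq installed, and is an approximation of the interfaces example above, not a full replacement:

    # Structured take on the fragile egrep pipeline above: ask for
    # interfaces as JSON, then pick fields by name, not by position.
    ip -j addr show | jq -r '.[] | "\(.ifname) \(.addr_info[0].local // "-")"'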

That’s it for this week. Have your own thoughts on designing a better shell? Played with Perl 6, PowerShell or Hotwire? What do you think of GConf? Comments are below as usual.

Tune in next week for part II.