Nov 3, 2013

Huawei BM622 BlankWAN Infinity 2013

Today a friend of a friend came to my house with three (3) units of the Huawei BM622 4G WiMAX wireless modem, asking me to repair his three defective CPEs. I took one modem, plugged in the AC/DC power adapter and turned on the power button. After the system finished initializing, the 3 WiMAX LED indicators stayed on; the device was obviously already BlankWAN.


To confirm it, I plugged a Cat5 cable into the PC and into the modem port. The GUI (graphical user interface) was accessible with the default username and password (user and admin), and I could navigate all the menus and configure the settings, but the WAN MAC was totally blank.

I immediately removed all four screws, opened the case, got my Aeulos 850C SMD hot-air station and removed the STMicro M25P32 SPI flash memory. I thought this would be an easy job: read the chip, erase it, verify it is blank, reflash the firmware, solder the chip back, done. To my surprise, it was not. I had a spare stock of ten (10) M25P32 SPI flash chips, and when I tested them on the BM622, every one of them came up empty; all ten chips turned BlankWAN on this defective 4G WiMAX modem.
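For readers who want to reproduce the blank-check step, the idea can be sketched in a few lines. This is a minimal illustration only: the dump here is a hypothetical stand-in for a real M25P32 read taken with an external programmer, and it relies on the fact that erased NOR flash reads back as 0xFF.

```python
# Minimal sketch: check whether an SPI NOR dump (e.g. from an M25P32
# read with an external programmer) is blank. Erased NOR flash reads
# as 0xFF, so a "BlankWAN" chip dump should be all-0xFF.

def is_blank(dump: bytes) -> bool:
    """Return True if every byte in the dump is 0xFF (erased state)."""
    return all(b == 0xFF for b in dump)

def blank_ratio(dump: bytes) -> float:
    """Fraction of bytes in the erased (0xFF) state."""
    if not dump:
        return 0.0
    return sum(b == 0xFF for b in dump) / len(dump)

if __name__ == "__main__":
    # Stand-in for a real 4 MiB M25P32 dump (the M25P32 is 32 Mbit).
    dump = bytes([0xFF]) * (4 * 1024 * 1024)
    print(is_blank(dump))
    print(blank_ratio(dump))
```

A partially programmed chip would show a `blank_ratio` well below 1.0, which is a quick way to tell an erased chip from a corrupted one.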

I took the second Huawei BM622 modem and repeated the same process, hoping it would behave differently from the first one, but I came to the same conclusion, and the third BM622 brought by my friend's friend was no different.

Experimenting further to pinpoint what exactly causes the STMicro M25P32 to come up empty, or "BlankWAN Infinity", as soon as the chip is plugged into the CPE, I took one fully working unit and moved its two flash memories, the S29GL064 and the M25P32, to the defective unit. The result was the same: BlankWAN.

In my last experiment, on the other hand, I took the S29GL064 from the defective unit and plugged it into the good working BM622; this time the test did not give me BlankWAN. In short, what causes "BlankWAN Infinity" is the SoC (system on chip), not the two flash memories, i.e. the Spansion S29GL064/STMicro M29W640 and the M25P32.

As for a solution, I am still collecting information that will help solve this BlankWAN Infinity problem.

The Best AMD Microprocessor

When choosing a new computer, which specification is the most important one to look out for? Of course, it has to be the computer processor used. The performance of a laptop or desktop computer will be largely decided by the processing speed of the computer chip embedded in its motherboard. If you are assembling a computer, every piece of computer hardware needs to be smartly chosen. As far as choosing the best processor for laptop or desktop computers is concerned, there are only two brands to choose from - AMD & Intel. Both manufacturers have rolled out a range of chips, with varying features, to suit the requirements of different users.

Choosing the Best Computer Processor

If you are buying a desktop or laptop computer on your own for the first time, you need some guidance on what features to look for in computer processors. The most important indicator of the processing speed of any AMD or Intel chip is its clock frequency: the higher the clock frequency of the processor, specified in hertz, the faster its operation.

This is the age of multicore processors, and the more cores a chip has, the better it is at multitasking. AMD processors come with two, three, four or even six cores, which enhances their productivity and efficiency. Beyond that, if you really want to go into the details, check the processor cache size: the more data the cache can hold, the faster the chip.
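To see why "more cores" gives diminishing returns on a single task, Amdahl's law is the usual back-of-the-envelope check. The sketch below is illustrative and not from the article; the 80% parallel fraction is an assumed number.

```python
# Sketch of Amdahl's law: the speedup from extra cores is capped by the
# fraction of the work that must run serially. Figures are illustrative.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup of a workload when run on `cores` cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

if __name__ == "__main__":
    # A hypothetical workload that is 80% parallelizable:
    for n in (1, 2, 4, 6):
        print(f"{n} cores -> {amdahl_speedup(0.8, n):.2f}x")
```

Even a six-core chip cannot exceed a 5x speedup on that workload, which is why clock speed and cache still matter alongside core count.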

Also check whether the processor has an integrated graphics processing unit (GPU), which will enhance video processing speed. When choosing the best gaming chip, go for a top-of-the-line processor that scores high on all the parameters discussed above. In the next section you will find a list of the best processors on the market, which will make it easier to choose the best computer processor from the AMD line.

Best AMD Processor For the Money

Which processor would be best for your computer depends entirely on its application area. AMD produces a huge range of processors, from those used in computer servers to desktop and laptop chips. Here are the top picks.

AMD Phenom II X6 1090T Black Edition
Undoubtedly the best processor for gaming and general usage is the AMD Phenom II X6 1090T, which comes with six built-in cores and a clock frequency of 3.2 GHz. It is the best gaming processor to come out of the AMD stable. The lowest price tag on this phenomenal computer chip is $229.99.

AMD Phenom II X4 965 Black Edition
Second in line to the throne is the AMD Phenom II X4 965, with a clock frequency of 3.4 GHz. This quad-core chip is AMD's best option in the mid price range. Its minimum listed price is $159.99. Given the choice, I would personally pick this one.

AMD Phenom II X4 955 Black Edition
The third best is another quad-core processor, the AMD Phenom II X4 955. It runs at 3.2 GHz and is ideal for desktop use. This one will cost you only $144.99.

AMD Phenom II X4 940 Deneb Black Edition
Another good choice, the AMD Phenom II X4 940, is ideal for a desktop or laptop running basic applications. With a price tag of $162.99, it is quite reasonably priced for general-purpose use.

In terms of raw performance, Intel surpassed AMD long ago. However, the USP of AMD processors is their price range, which is substantially lower than that of their Intel counterparts. Choose whichever of the above AMD processors best suits your requirements. For more details on each of them, I suggest you visit AMD's official website.

Understanding the Differences Between SCSI, SATA and IDE

As the world witnessed the evolution of the 'personal computer' or PC, computers went from being large special machines to personal devices that made everything easy for the user, with the help of input devices. These devices were nothing but the interface between the user and his personal computer, but these interface/peripheral devices too needed a bridge between themselves and the internal system. This is what led to the development of setups like SCSI, SATA and IDE.

Let's have a look at the features and differences between the three.

SCSI

History
In the late '70s, as the need for a physical and logical interface between peripheral devices and computers arose, Shugart Associates came up with SASI (Shugart Associates System Interface), an interface that acted as a bridge between the hard drive and the computer. This 50-pin flat ribbon connector was later standardized and sold commercially as SCSI-1.


Popularly known as "scuzzy", this interfacing standard was supported by many electronics and hardware industry leaders of the time. Various versions of SCSI have been released since, and though it is considered more or less outdated nowadays, some low-end personal computers still use it because of its low cost.

Build and Working
The very first SCSI used a 50-pin flat ribbon connector. The SCSI boards were typically the size of a hard disk and were physically mounted. While the earlier SCSI buses were parallel interfaces, current SCSI interfaces communicate serially, and serial SCSI offers faster transmission than the parallel versions.

SCSI interfaces can either be mounted on the motherboard or implemented with plug-in adapters. Although SATA has virtually eliminated any chance of modern computer systems shipping with SCSI provisions, separate SCSI adapters for motherboards that do not support it are still available on the market, though it won't be long before they are phased out too.

Storage
SCSI allows 7 to 15 devices (depending on the width of the bus) to be connected on one chain. All the devices can share the same controller rather than requiring a separate board for each device, which would otherwise increase both cost and complexity inside the case.

Speed
Current parallel SCSI implementations can transfer up to 80 megabytes/second, and the SCSI family has spawned serial transports such as Fibre Channel, IEEE 1394 and SSP (the Serial SCSI Protocol used by SAS). Modern SCSI devices are backward compatible: if an older device is connected, SCSI will still support it, although transmission speed may suffer.

Price and Utility
SCSI has historically been a burden on the pocket, and its newer versions are no more comfortable. Considering that there are at least 10 different varieties of SCSI (3 of them new-age) on the market right now, choosing the right one, and hoping that SCSI standards don't get washed out anytime soon, is quite a task in itself. What works in SCSI's favor, though, is that it supports a variety of devices, from dot-matrix printers to scanners, plotters, and modern keyboards and mice.

IDE

History
IDE was developed by Western Digital in association with Control Data Corporation and Compaq Computers, and was launched in 1986. It has evolved a lot since then. By the mid '90s, IDE-supported ATA drives had almost eclipsed SCSI-supported devices. Western Digital's drives with embedded IDE controllers were such a huge rage that they wiped out much of the competition, including the costlier SCSI model of device interfacing. IDE popularly came to be known as PATA, for its parallel style of data transfer.


Build and Working
IDE originally used 40-pin connectors with flat ribbon cables; later, 80-conductor cables (still with 40-pin connectors) were introduced for the faster transfer modes. PATA transfers 16 bits at a time and works on a plug-and-play basis.

Round parallel ATA cables made an appearance in the late '90s, when assembled computers and modified CPU cases had become a regular feature. These cables offered flexibility as well as better cooling effects.

Storage
PATA allows the connection of two devices per channel.

Speed
The most recent versions allow 133 MB/s in burst mode (ATA/133, UDMA Mode 6).

Price and Utility
PATA, as a cheaper alternative to SCSI, was extremely successful, owing to its low price and better value for money. PATA interfaces are still used mainly in legacy and industrial setups.

SATA

History
Serial ATA was created around the turn of the century to replace PATA (IDE). In 2003, SATA was launched amidst much fanfare, and within a decade it captured some 98% of the market in personal computers. SATA, originally launched as a 1.5 Gbit/s interface, can transfer up to 6 Gbit/s in its latest version.


Build and Working
SATA offers serial connectivity and hot plugging, a facility by which computer components can be replaced without shutting down the system. SATA also carries an important feature, the Advanced Host Controller Interface (AHCI), an open host-controller specification.

A SATA data cable has 7 pins and is typically up to a meter in length. This makes it easier for SATA cables to fit in small devices and allows better air cooling. Connectors come straight and angled, with angled ones widely used since they allow lower-profile connections. A mini-SATA connector is used for small storage drives, and an eSATA connector for connecting external devices.

Storage
A SATA cable allows only one connection per connector. This makes it less desirable in large industrial setups, where more hardware for less cost is the norm. A notable innovation is the Universal Storage Module, which supports cable-free slot-in drives.

Speed
SATA was launched as a 1.5 Gbit/s interface. Later versions support transfer speeds of 3 Gbit/s and 6 Gbit/s.

Price and Utility
SATA devices are the least expensive peripherals on the market. Bi-directional PATA-to-SATA adapters are available if you want to avoid the premium that legacy PATA hardware now commands.

Comparing the three interfaces makes it clear that most personal systems today use SATA. IDE has become expensive and has been successfully superseded by SATA, while SCSI is almost obsolete and may well become history a few years down the line. SATA is the future, until a more convenient and cheaper option makes an appearance; its low price and universal appeal only add to its score. What cannot be denied, though, is that SCSI, IDE and SATA have changed the way we use computers and related devices today.

The Best Antivirus For Android Smartphone

With the development of 3G and 4G cell phone technologies, the newly evolving cell phone platform is a hotbed for new application development. With the rise of Google's Android operating system, Linux has extended its dominion to this new platform. Though it inherits the security features of Linux, Android is not completely immune to viruses directed at it. With the growing use of this mobile operating system, it's essential that there be safeguards in the form of antivirus software that can be installed on these phones.


Since a large number of users now prefer accessing the Internet, checking email and browsing the web via smartphones, they are bound to be vulnerable to hacking attacks. Recognizing the rise of smartphones as the new computing platform, many companies that write antivirus software for computers have directed their resources towards creating antivirus software for Android and other mobile operating systems.

Choosing the Best Antivirus Software For Android

What features should one look for when choosing antivirus software for a phone? One of the prime factors is its ability to identify and clean viruses and other malware. It should have a regularly updated database of known threats and be able to scan all the vital Android system files for infections on demand. Real-time scanning and checking of mobile apps for viruses are other features to look for. Theft-protection features are an added bonus: the program should make it easy to locate the phone in case of theft and protect the data stored on it. Scanning of SMS content for malware is also a feature to look out for.

Reviews

Considering that you may share very sensitive personal information, including credit card details, when shopping through your Android phone, it's essential that you have antivirus software that can provide an effective bulwark against hacking threats. Here are some free and paid programs of note.

AVG Antivirus Free
One of the best programs for the Android platform is AVG Antivirus Free. It offers a thorough scan of all SMS, email, apps and storage devices that are part of the Android phone. It comes with malware scanning, along with theft protection that can help you protect your data in case your phone gets stolen. SMS spam protection is also provided. Just like the computer version, this free AVG application for Android is a very reliable option.

SMobile Security Shield
Another contender for the title of best antivirus for Android is SMobile Security Shield, which you can purchase for $29.99. This is a custom-designed program for the Android operating system that includes an antivirus engine, a backup feature and an anti-theft facility that can let you wipe all important data.

Super Security Standard
One of the free antivirus programs for Android phones is Super Security Standard edition. With real-time and on-demand scanning options, it provides all the basic security features you need.

AVG Antivirus Pro
The advanced protection option from AVG is Antivirus Pro. It includes all the features of the free version, along with extras such as strong anti-SMS-spam protection and on-demand technical support. It will cost you $9.99 to get it installed.

While there is little chance of encountering virus problems on Android phones, it's best to install one of these programs. I would personally recommend AVG Antivirus Pro for Android, which is backed by AVG's solid reputation as one of the best antivirus makers on the computing platform.

List of Best Dual Core Android Phones

For people who live by their smartphones, there are times when the average single-core processor won't be enough. Whether it's for work, file sharing, socializing or gaming, the rate at which all the apps in the world are evolving demands a faster smartphone. So here are the best upcoming dual-core smartphones you can choose from.


Top 5 Dual Core Android Smartphones

They are big, they are powerful, and they are a must-have for anyone who lives on the go. Take your pick from among the best of the best.

Motorola Droid Bionic 4G

Features:
  • Android 2.3.4
  • 4G LTE Networks
  • 1 GB RAM, OMAP4 Dual-core 1GHz processor
  • 1735 mAh battery
  • 4.3-inch qHD 540×960 TFT LCD
  • 16GB internal storage, microSD slot
  • 8-megapixel camera
The Droid may win the race based on its battery. The problem with dual-core processors is that they use up too much power; the slimmer your phone, the weaker your battery and the shorter the time your phone remains usable. But the Droid offers 10 h 40 min of talk time on a full charge. As far as competition with the iPhone 5 is concerned, rest assured that the Droid's 4G LTE network is faster than what the iPhone can come up with, including AT&T's HSPA+.

Samsung I9100 Galaxy S II

Features:
  • Android 2.3 Gingerbread
  • 1.2 GHz dual-core ARM Cortex A9 processor
  • 16GB or 32GB of storage, micro SD apart from 8GB of phone memory
  • 1GB of RAM
  • 4.3 inches SUPER AMOLED Plus touchscreen of 480 x 800-pixel resolution
  • 1080p video recording at 30fps
  • Standard battery, Li-Ion 1650 mAh
There are only two things that might bother you about the S II. The first is the price: original reports put the phone at around $800, which later settled to around $599-$699 for the 16GB version. The other problem is the relatively low resolution, but it really isn't that big a deal, as the phone's colors are vibrant enough to mask any pixellation. Also, the phone lacks a hardware camera button, and you need to remove the battery to take out the SIM card. Apart from that, the phone is regarded as the perfect dual-core smartphone for anyone. If you can afford it, you will be a very happy Android user.

Motorola Photon 4G

Features:
  • Android 2.3.3
  • 1GHz NVIDIA Tegra 2 Dual-Core processor
  • 48GB combined storage capability
  • 1GB of RAM
  • 540 x 960 pixel resolution and 4.3 qHD display
  • 8 Megapixel camera
  • Standard battery, Li-Po 1700 mAh
The Photon will sell mostly because of its international GSM capability (via Sprint), making it a very convenient option for people who travel internationally a lot. The other plus for the Photon is its superb web browsing, which actually manages to make surfing on the phone fun.

HTC EVO 3D

Features:
  • 1.2GHz dual-core Qualcomm 8660 Snapdragon CPU
  • 4.3-inch qHD 960 x 540-pixel Autostereoscopic 3D display
  • dual rear 5-megapixel cameras with dedicated camera button, 1.3-megapixel front-facing camera
  • HDMI 1.4
  • Standard battery, Li-Ion 1730 mAh
  • 1GB internal storage, 32GB microSD external storage
  • 1GB RAM
The biggest problem with this phone is its battery life, which sort of dampens the joy of having a phone with features too numerous to count.

HTC Sensation 4G

Features:
  • Android OS, v2.3
  • 1.2 GHz dual-core processor, Adreno 220 GPU, Qualcomm MSM 8260 Snapdragon
  • S-LCD capacitive 540 x 960 pixels, 4.3 qHD
  • 1 GB storage, 8 GB internal with microSD extension up to 32GB
  • 768 MB RAM
  • Standard battery, Li-Ion 1520 mAh
  • 8 Megapixel camera
The phone wins for its 1080p video recording and stereo audio recording capability, and the screen resolution is quite good too. There have been a few minor reports of the phone heating up during use, but nothing too serious.

More Phones

Those are the best in the business of dual-core Android smartphones. There are other good phones too, but I found the ones above much better in comparison.
  • Motorola Atrix 4G: The phone runs on Android version 2.2 Froyo (upgradable to 2.3.4 Gingerbread), the Tegra 2 AP20H chipset and comes with configurable camera settings.
  • LG Thrill 4G: It comes with a 5 Megapixel camera and a 4 inch capacitive touchscreen.
  • LG G2X: Slower than the others due to its 512MB RAM; runs on Android 2.2 but will soon be ready for 2.3.
All these phones are available in the market now, and they are set to redefine what "blazing fast" means on mobile. You would be joining many if you intend to buy one of them and use your older smartphone as a second phone. All you need now is carrier information and you're set to buy your very first dual-core smartphone.

Android OS For Personal Computer

Google's answer to Symbian and iOS is Android. It is a Linux avatar, specifically designed for mobile phones and other portable devices like tablet computers. The Linux pedigree means that portability is inherent in this operating system. Very few people in Android fan communities know that Android OS for the PC has been a reality since August 2009, when a developer team released a live CD that lets computer users test the prowess of this new operating system. Customized distributions of the latest Android versions are now made available online for free: the Android-x86 website, hosted by a community of developers, provides the latest customized builds of Android for computers.


The concept of a live CD lets you test an operating system without touching the fully functional operating system already installed on your PC. So people who are thinking of switching to a new operating system can check it out before deciding to migrate to it completely.

If you feel like testing the Android OS before buying an Android phone, getting a live CD or live USB version and running it is the best way to do so. Making these live versions available was one way of increasing the operating system's credibility among the masses, and that's exactly what the folks at Android-x86 have done for you. If you like the live CD version, you can install it on your netbook, laptop or desktop as a standalone or alternative operating system. Let us see how you could go about it.

How to Get Android OS For Your PC?

So how and where can you get the Android for PC setup? Like all developers in the Linux open source community, the Android-x86 team has made the live CD version, as well as the live USB version, available on its official website. Visit the Android-x86.org web page or any of the mirror sites to download the setup files. Images of builds based on successive Android versions (Cupcake, Donut, Eclair, Froyo, Gingerbread, Honeycomb and Ice Cream Sandwich) are available. Some builds are exclusively designed for specific brands of netbooks and laptops, such as Asus, Lenovo and Dell. Each ISO file can be used to create a live CD, which also provides the option of hard drive installation.

Know that these versions work only on x86 systems. Read all the instructions regarding download and installation on the website. A glance through the FAQ section will also be very helpful. Let me walk you through the installation.

How to Install and Run Android OS on Your PC?

Once you get the setup files on your computer, the rest is quite easy. There are two ways to run Android on a computer without installing it on the hard drive: either burn the live CD ISO image to a CD and boot from your optical or USB drive, or run it in VMware or VirtualBox. If you like Android, you can go ahead and install it on your hard drive. Here are step-by-step instructions for installing the Android-x86 versions.

Step 1: Download Latest Build From Android-X86 website
The latest version available at the time of writing is Android-x86 4.0-RC1. You can get the whole ISO image directly from the main website or one of the mirror sites. These builds support the FAT32, NTFS, ext3 and ext2 file systems. You also have the option of running Android in VMware or VirtualBox.

Step 2: Burn the Image to a CD or Create a Bootable USB
First, prepare the Android ISO image for installation: burn it to a blank CD, or write it to a USB drive using UNetbootin. Now you are ready for installation.
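As a hedged alternative to UNetbootin on Linux, `dd` can write the ISO image to the USB stick. In this sketch the file names are stand-ins: a dummy ISO is generated purely for illustration, and `usb.img` stands in for the real device node (double-check the device name with `lsblk` before writing to real hardware).

```shell
# Illustrative only: create a dummy 4 MiB "ISO" so the commands can run
# without the real download. In practice, use the Android-x86 ISO you
# downloaded, and replace usb.img with your USB device (e.g. /dev/sdX).
dd if=/dev/zero of=android-x86.iso bs=1M count=4 status=none

# Write the image to the target; conv=fsync flushes before dd exits.
dd if=android-x86.iso of=usb.img bs=4M conv=fsync status=none

# Verify the copy byte-for-byte before booting from it.
cmp -s android-x86.iso usb.img && echo "image written OK"
```

Writing directly with `dd` produces a raw copy of the ISO, so whether the stick boots depends on the image being hybrid-bootable; UNetbootin sidesteps that by installing its own boot loader.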

Step 3: Run Installation
To run the installation, simply plug in the USB drive or put the installation CD into the optical drive and boot the PC. The installer will give you a couple of options: choose 'Run Android without installation' (to boot from the CD) or 'Install Android to hard disk' (for a permanent installation). Android can coexist with other operating systems on the hard drive in a multiboot setup.

Step 4: Choose Partition and File System Format
If you choose hard drive installation, you will be asked to pick a partition. You may choose one of the available partitions, create a new one, or install to a removable USB drive; for that third option, choose 'Search Devices'. Then decide whether the partition should be formatted and, if so, choose the file system type.

Step 5: Install Grub
Lastly, you will be asked whether GRUB (the boot loader) should be installed. Install it if you are going for a multiboot setup. With that done, installation begins immediately, and within a few minutes (depending on the speed of your PC) you will be greeted with the stunning Android interface, ready to use!
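For reference, a legacy GRUB entry for an Android-x86 install has roughly the following shape. This is an illustrative sketch only: the partition, directory names and kernel options vary by build, and the Android-x86 installer writes the correct entry for you when you let it install GRUB.

```
title Android-x86 4.0-RC1
    root (hd0,1)
    kernel /android-4.0-RC1/kernel quiet root=/dev/ram0 SRC=/android-4.0-RC1
    initrd /android-4.0-RC1/initrd.img
```

If you maintain a multiboot menu by hand, an entry like this sits alongside your other operating systems' entries in the GRUB configuration.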

As you will realize, the power of Android lies in its multitasking prowess and light use of computing resources. Through Android, Linux is reborn in a new mobile incarnation. The future is ANDROID for sure.

The Pros and Cons of Cloud Computing

Cloud computing refers to the use of computing resources (hardware and/or software) that reside on a remote machine and are delivered to the end user as a service over a network, the most prevalent example being the Internet. By definition, a user entrusts his data to a remote service over which he has little to no influence.


When it first appeared as a term and a concept, a lot of critics dismissed it as the latest tech fad. However, cloud computing managed to cut through the hype and truly shift the paradigm of how IT is done nowadays. The cloud has succeeded in cutting costs for enterprises and helping users focus on their core business instead of being obstructed by IT issues. For this reason, it seems it is here to stay for the foreseeable future.

Categories of Cloud Computing

There are mainly four models of cloud computing:
  • Infrastructure as a Service (IaaS)
  • Platform as a Service (PaaS)
  • Software as a Service (SaaS)
  • Network as a Service (NaaS)
Let’s discuss those in more detail.

Infrastructure as a Service (IaaS): This is the most basic cloud-service model, which provides the user with virtual infrastructure, for example servers and data storage space. Virtualization plays a major role in this model, allowing IaaS cloud providers to supply resources on demand, drawing them from large pools installed in data centers.

Platform as a Service (PaaS): In this model, cloud providers deliver a development environment as a service, on which the user can develop and run in-house-built applications. The services might include an operating system, a programming-language execution environment, databases and web servers.

Software as a Service (SaaS): In this model, the cloud provides the user with access to already developed applications running in the cloud. Access is achieved through cloud clients, and cloud users do not manage the infrastructure where the application resides, eliminating the need to install and run the application on their own computers.

Network as a Service (NaaS): The least common model, in which the user is provided with network connectivity services, such as VPN and bandwidth on demand.

Advantages of Cloud Computing

Cloud computing offers numerous advantages to both end users and businesses of all sizes. The obvious huge advantage is that you no longer have to support, or have the knowledge to develop and maintain, the infrastructure, development environment or application, as was the case until recently. That burden has been lifted; someone else takes care of it all. Businesses can now focus on their core business by outsourcing the hassle of IT infrastructure.

Let's look at some of the most important advantages of cloud computing in more detail, from both a company's and an end user's perspective.

Cost Efficiency

This is the biggest advantage of cloud computing, achieved by eliminating the investment in standalone software or servers. By leveraging the cloud's capabilities, companies can save on licensing fees and at the same time eliminate overhead charges such as the cost of data storage, software updates and management.

The cloud is in general available at much cheaper rates than traditional approaches and can significantly lower the overall IT expenses. At the same time, convenient and scalable charging models have emerged (such as one-time-payment and pay-as-you-go), making the cloud even more attractive.

In more technical and analytical terms, cloud computing delivers better cash flow by eliminating the capital expense (CAPEX) associated with building and maintaining server infrastructure.
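As a back-of-the-envelope illustration of the CAPEX argument (all figures below are hypothetical and not from any provider's price list):

```python
# Illustrative break-even sketch: upfront server purchase (CAPEX) versus
# pay-as-you-go cloud pricing. Every number here is hypothetical.

def breakeven_months(server_capex: float, server_monthly_opex: float,
                     cloud_monthly_fee: float) -> float:
    """Months until an owned server becomes cheaper than renting cloud."""
    monthly_saving = cloud_monthly_fee - server_monthly_opex
    if monthly_saving <= 0:
        return float("inf")  # the cloud never costs more per month
    return server_capex / monthly_saving

if __name__ == "__main__":
    # e.g. a $6,000 server with $50/month power+admin versus a
    # $300/month cloud instance:
    print(round(breakeven_months(6000, 50, 300), 1), "months")
```

The point of the sketch is the shape of the trade-off, not the numbers: pay-as-you-go converts a large upfront payment into a monthly operating expense, which is exactly the cash-flow benefit described above.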

Convenience and continuous availability

Public clouds offer services that are available wherever the end user might be located. This approach enables easy access to information and accommodates the needs of users in different time zones and geographic locations. As a side benefit, collaboration booms since it is now easier than ever to access, view and modify shared documents and files.

Moreover, service uptime is in most cases guaranteed, providing in that way continuous availability of resources. The various cloud vendors typically use multiple servers for maximum redundancy. In case of system failure, alternative instances are automatically spawned on other machines.

Backup and Recovery

The process of backing up and recovering data is simplified, since the data now resides in the cloud and not on a physical device. The various cloud providers offer reliable and flexible backup/recovery solutions. In some cases, the cloud itself is used solely as a backup repository for data located on local computers.

Cloud is environmentally friendly

The cloud is in general more efficient than the typical IT infrastructure, and it takes fewer resources to compute, thus saving energy. For example, when servers are not in use, the infrastructure normally scales down, freeing resources and consuming less power. At any moment, only the resources that are truly needed are consumed by the system.

Resiliency and Redundancy

A cloud deployment is usually built on a robust architecture thus providing resiliency and redundancy to its users. The cloud offers automatic failover between hardware platforms out of the box, while disaster recovery services are also often included.

Scalability and Performance

Scalability is a built-in feature for cloud deployments. Cloud instances are deployed automatically only when needed and as a result, you pay only for the applications and data storage you need. Hand in hand, also comes elasticity, since clouds can be scaled to meet your changing IT system demands.

Regarding performance, the systems utilize distributed architectures which offer excellent speed of computations. Again, it is the provider’s responsibility to ensure that your services run on cutting edge machinery. Instances can be added instantly for improved performance and customers have access to the total resources of the cloud’s core hardware via their dashboards.

Quick deployment and ease of integration

A cloud system can be up and running in a very short period, making quick deployment a key benefit. In the same vein, adding a new user to the system happens instantaneously, eliminating waiting periods.

Furthermore, software integration occurs automatically and organically in cloud installations. A business is allowed to choose the services and applications that best suit their preferences, while there is minimum effort in customizing and integrating those applications.

Increased Storage Capacity

The cloud can accommodate and store much more data compared to a personal computer, and in a way offers almost unlimited storage capacity. It eliminates worries about running out of storage space and at the same time it spares businesses the need to upgrade their computer hardware, further reducing the overall IT cost.

Device Diversity and Location Independence

Cloud computing services can be accessed via a plethora of electronic devices that can access the internet. These devices include not only traditional PCs, but also smartphones, tablets etc. With the cloud, the “Bring your own device” (BYOD) policy can be easily adopted, permitting employees to bring personally owned mobile devices to their workplace.

An end-user might decide not only which device to use, but also where to access the service from. There is no limitation of place and medium. We can access our applications and data anywhere in the world, making this method very attractive to people. Cloud computing is in that way especially appealing to international companies as it offers the flexibility for its employees to access company files wherever they are.

Smaller learning curve

Cloud applications usually entail smaller learning curves, since people are already quite used to them. Users find it easier to adopt them and come up to speed much faster. Prime examples of this are applications like Gmail and Google Docs.

Disadvantages of Cloud Computing

As made clear from the above, cloud computing is a tool that offers enormous benefits to its adopters. However, being a tool, it also comes with its set of problems and inefficiencies. Let’s address the most significant ones.

Security and privacy in the Cloud

Security is the biggest concern when it comes to cloud computing. By leveraging a remote cloud based infrastructure, a company essentially gives away private data and information, things that might be sensitive and confidential. It is then up to the cloud service provider to manage, protect and retain them, thus the provider’s reliability is very critical. A company’s existence might be put in jeopardy, so all possible alternatives should be explored before a decision is made. On the same note, even end users might feel uncomfortable surrendering their data to a third party.

Similarly, privacy in the cloud is another huge issue. Companies and users have to trust their cloud service vendors to protect their data from unauthorized users. The various stories of data loss and password leakage in the media do not help to reassure some of the most concerned users.

Dependency and vendor lock-in

One of the major disadvantages of cloud computing is the implicit dependency on the provider. This is what the industry calls “vendor lock-in”, since it is difficult, and sometimes impossible, to migrate away from a provider once you have committed to one. If a user wishes to switch to some other provider, it can be really painful and cumbersome to transfer huge volumes of data from the old provider to the new one. This is another reason why you should carefully and thoroughly contemplate all options when picking a vendor.

Technical Difficulties and Downtime

Certainly, smaller businesses will enjoy not having to deal with daily technical issues and will prefer handing those over to an established IT company; however, you should keep in mind that all systems might face dysfunctions from time to time. Outages and downtime are possible even with the best cloud service providers, as the past has shown.

Additionally, you should remember that the whole setup is dependent on internet access, thus any network or connectivity problems will render the setup useless. As a minor detail, also keep in mind that it might take several minutes for the cloud to detect a server fault and launch a new instance from an image snapshot.

Limited control and flexibility

Since the applications and services run on remote, third party virtual environments, companies and users have limited control over the function and execution of the hardware and software. Moreover, since remote software is being used, it usually lacks the features of an application running locally.

Increased Vulnerability

Related to the security and privacy concerns mentioned before, note that cloud based solutions are exposed on the public internet and are thus a more vulnerable target for malicious users and hackers. Nothing on the Internet is completely secure, and even the biggest players suffer from serious attacks and security breaches. Due to the interdependency of the system, if one of the machines where data is stored is compromised, personal information might leak to the world.

Conclusion

Despite its disadvantages and the fact that it is still in its infancy, cloud computing remains strong and has great potential for the future. Its user base grows constantly and more big players are attracted to it, offering better and more finely tuned services and solutions. We can only hope that the advantages will further grow and the disadvantages will be mitigated, since cloud computing seems to have made IT a little bit easier. Happy cloud computing!

Understanding How Cloud Computing Works

The Internet has changed the way we communicate and brought a paradigm shift in the way information travels around the world. In the past, the nature of the Internet was that of a repository of information, or a gigantic library with global open access. Now the Internet is taking the next step in its evolution with cloud computing.


Cloud Computing In a Nutshell

Cloud computing is the provision of all computing resources through the medium of the Internet. It is about taking the client server architecture to an extreme, in terms of scaling and resource sharing. A computer or any mobile device connected to a cloud network will have all its data, along with all the processing programs hosted on a remote server farm.

It will access data and use the resources on the cloud servers, from any terminal connected to them. Just like power is supplied to any region in a power grid, according to demand, so are the resources allocated to terminals in a cloud, based on demand. The 'cloud' in cloud computing essentially stands for the Internet.

Instead of installing programs on your own computer, you can access the same programs as web service applications that are made accessible via a web browser. So, any user can access and save his work or data on the cloud computing server. This provides users with the freedom to access their work from any terminal they choose to work on, which can make a remote connection with the cloud.

Cloud computing became a reality because of the increasing high speed connectivity provided by the Internet to remote locations. The power of Internet connectivity has been harnessed to share resources on a cloud. Google and Amazon are some of the main players in this business and they are aggressively promoting this concept across the globe.

Applications

There are several applications of cloud computing. Private clouds grant limited access to a private network of computers. Amazon and Google have set up cloud services for private use. The reason for the popularity of cloud computing is its economics: instead of purchasing proprietary software and bearing the cost of maintenance, companies can save those costs by opting for a cloud service instead.

Cloud is thus the next step in the evolution of computing and the Internet, where its power is harnessed not just to share information, but also services. The software development technology that has made this software resource sharing possible is the concept of an 'Application Programming Interface' (API), which makes interaction between programs from different platforms possible.

You have already seen the power of cloud computing if you have used the Google Docs feature. It is a way of making optimum use of resources. Many companies prefer a cloud solution, as it reduces costs and provides the most optimized computing solution. With time, private clouds will be the preferred mode of resource sharing for most businesses around the world.

Anonymous hacks Singapore: The Straits Times newspaper (video)

Activist group Anonymous has hacked a Singapore newspaper website over internet freedom in the city-state, where government agencies are now reportedly on alert for wider cyber attacks.

The website of the pro-government Straits Times was hacked early in the day by apparent members of the group, which opposes recently introduced licensing rules for news websites in Singapore on censorship grounds.


The attackers, using the name 'Messiah', took over the blog of a Straits Times journalist, saying she had distorted 'our words and intentions' in a report on the group's threat a day earlier to 'wage war' on the Singapore government.

'We oppose any form of internet censorship among other things,' said a post on the journalist's hacked blog, which is part of the newspaper's website and has been taken offline.

The hackers urged the journalist to apologise within 48 hours 'to the citizens of Singapore for trying to mislead them'.


If she fails to apologise, 'then we expect her resignation', the hacker said in the hacked account, still visible in online caches.

'If those demands are met we will be on our way. But in the event our demands are not met in the next 48 hours, we will place you in our to do' list and next time you wont (sic) be let off this easy.'

Asian media giant Singapore Press Holdings, which publishes the newspaper, said: 'We have made a police report, and the police are investigating.'

The attack on the Straits Times followed a post on YouTube on Thursday in which a person claiming to speak for Anonymous warned the group would cause Singapore to suffer financial losses from 'aggressive cyber intrusion'.

Singapore, which has been governed by the same party for 54 years and strictly regulates the traditional media, is Southeast Asia's financial centre and hosts the regional headquarters of many global companies.

Reacting to the YouTube clip, Singapore's Infocomm Development Authority said: 'We are aware of the video, and the police are investigating into the matter.'

The Straits Times, meanwhile, said it had learned government agencies had been put on alert in Singapore following the initial threat on Thursday.

The new rules opposed by the hackers were imposed on June 1 and require annual licensing for news websites with at least 50,000 unique visitors from within Singapore every month.

Websites granted a licence must remove 'prohibited content' such as articles that undermine 'racial or religious harmony' within 24 hours of being notified by Singapore's media regulator.

Nov 2, 2013

Re-programming the 2Wire NAND flash IC

Californian hacker RyanC suggested another method for unlocking the 2Wires: re-purposing a SmartMedia or xD-Picture card reader to program the NAND flash memory. [1]

The SmartMedia format uses the standard ONFI command set for reading and programming the NAND flash. The xD-Picture specs are slightly more involved, being a superset of ONFI.
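To make the "standard ONFI command set" concrete, here is a purely illustrative Python sketch of the byte sequence a host issues for an ONFI page read on a typical large-page device (two column plus three row address cycles); the bus is represented as a list of (cycle-type, value) tuples rather than real hardware access.

```python
# Illustrative sketch of the ONFI page-read command sequence.
# The command bytes (0x00, 0x30) come from the ONFI spec; the
# tuple-based "bus" here is a hypothetical stand-in for hardware.

def onfi_read_page_sequence(page, column=0):
    """Return the cycle sequence a host issues to read one NAND page.

    ONFI page read: READ command (0x00), column + row address
    cycles, then READ CONFIRM (0x30); data is clocked out after
    the chip's busy period ends.
    """
    seq = [("CMD", 0x00)]                      # READ command cycle
    seq += [("ADDR", column & 0xFF),           # column address, low byte
            ("ADDR", (column >> 8) & 0xFF)]    # column address, high byte
    seq += [("ADDR", page & 0xFF),             # row (page) address cycles
            ("ADDR", (page >> 8) & 0xFF),
            ("ADDR", (page >> 16) & 0xFF)]
    seq += [("CMD", 0x30)]                     # READ CONFIRM, starts array read
    return seq

print(onfi_read_page_sequence(0x1234))
```

The same framing is why a camera-card reader can read these chips at all: it already speaks this command set.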

Simple, so far?

2Wire, however, has its own flash translation layer (FTL) to hold the logical-to-physical block mapping. This mapping data is stored in the out-of-band (OOB) area of the NAND page. Unfortunately, the average flash card reader cannot program arbitrary data to the OOB area, so it can't be used to reprogram a 2Wire flash. All is not lost, though...
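The FTL idea can be sketched as follows. The byte layout used here is hypothetical (the real 2Wire OOB format isn't documented here); it only illustrates how each block's logical number travels in the spare bytes, which is exactly the data an ordinary card reader cannot write.

```python
# Sketch of why the OOB area matters for an FTL. The OOB byte
# layout below is a hypothetical example, not the 2Wire format:
# it shows the principle that each physical block carries its
# logical block number in the out-of-band (spare) bytes.

PAGE_SIZE = 512   # data bytes per page (assumed small-page NAND)
OOB_SIZE = 16     # spare bytes per page

def logical_block_of(oob):
    """Decode a (hypothetical) little-endian 16-bit logical block number."""
    return oob[0] | (oob[1] << 8)

def build_mapping(pages_with_oob, pages_per_block=32):
    """Build a logical->physical block map from each block's first page.

    pages_with_oob is a flat list of (data, oob) tuples in physical order.
    """
    mapping = {}
    for phys_block, i in enumerate(range(0, len(pages_with_oob), pages_per_block)):
        _data, oob = pages_with_oob[i]
        mapping[logical_block_of(oob)] = phys_block
    return mapping
```

A reader that can only write the data area would leave this map stale after any block relocation, which is why raw OOB access is required.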

Aside from professional NAND programmers costing $2,000 or more, there is one consumer-grade NAND controller IC which offers raw read and write access to all areas of the flash device. The IC, codenamed the Alauda, is something of a mystery. No one is even sure who developed it, but it was probably on behalf of Fuji and/or Olympus.

The Alauda IC has a USB peripheral controller to interface very simply with the PC. This allows easy transfer of control messages and page data to the raw NAND device. And it doesn’t matter if the NAND chip is embedded in a camera card, or in a TSOP48 surface mount package, as in the case of the 2Wire.

It was perhaps BrendanU who first publicly documented the capabilities of Alauda-based card readers. [2] An open source kernel driver was then developed for the Alauda by legendary Linux hacker Daniel Drake.[3] Cory1492, a well-known Xbox and PSP hacker, ported Daniel's code and built it against the userspace USB library, libusb.[4] Cory's efforts have made the tool available for most Unix platforms and even for Microsoft Windows.

Alauda NAND flash controller harnessed to a TSOP48 cradle; 256Mbit NAND from the 2Wire board loose beneath
The Alauda NAND controller IC

The plan to exploit this hack and hardware was described earlier. Briefly:
  • Gently lift the NAND flash IC off the PCB with a hot-air gun;
  • Dump contents with a NAND reader. For reasons above, the Alauda IC is ideal;
  • Rewrite “initd” XML table to re-enable secure shell daemon. See: http://pastebin.com/ss8sqMdu
  • Rewrite “user” XML table with new root password. See: http://pastebin.com/gucCEM3H
  • Update ECC in OOB areas of all modified pages. See: http://hack.error-correcting-code-ecc/
  • Re-program the modified NAND pages;
  • Re-install NAND IC on the 2Wire PCB;
  • Fingers crossed and boot!
Userspace tool for Alauda NAND reader by Cory1492
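The ECC-update step in the plan above only needs to touch pages that actually changed. A minimal sketch for finding those pages by diffing the patched dump against the original (512-byte pages assumed here; the actual ECC recomputation is left to whatever scheme the 2Wire firmware expects):

```python
# Find which NAND pages differ between the original dump and the
# patched dump, so only those pages need their OOB ECC updated
# and reprogrammed. Page size of 512 bytes is an assumption.

PAGE = 512

def modified_pages(original, patched, page_size=PAGE):
    """Yield indices of pages whose data differs between two equal-size dumps."""
    assert len(original) == len(patched), "dumps must be the same size"
    for i in range(0, len(original), page_size):
        if original[i:i + page_size] != patched[i:i + page_size]:
            yield i // page_size

orig = bytes(2048)              # four blank 512-byte pages
new = bytearray(orig)
new[600] = 0xFF                 # patch one byte inside page 1
print(list(modified_pages(orig, bytes(new))))   # -> [1]
```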
UPDATE:

This method was just trialled several times. While the NAND reading and writing works fine, the 2Wire board still won’t boot with our modified firmware image. The device just hangs with a solid red LED.

The search for that elusive 2Wire hack continues!

As for the NAND hack in general, it could be very useful in a range of other applications, whether for unlocking routers, digital TV set-top boxes, or reflashing PC BIOS chips.

EDIT:

There are some more notes in the comments below. In due time, it can be properly documented and referenced. The beauty of this NAND reader is that it costs scarcely $10 to make.

How to Setup a Bridge Long Distance WiFi

A WiFi Bridge can link your network to another network so that resources like Internet can be shared. Bridging devices work together in pairs so you will need two units. One unit is placed at each network. When a WiFi connection is established between both bridging units then the two networks become one.

A WiFi bridge is different from a WiFi router because it is able to connect two networks using WiFi. A normal WiFi router must connect to other networks using an Ethernet cable.


Equipment used in Illustration
  • WiFi Bridge (2)
  • Point-to-Point WiFi Antenna (2)
  • Low Loss Coax Cable (2)
  • Connector Fitting (2)
  • Router (any)
  • Modem (any)
  • Ethernet Cable (any)
Setup

It is best to initially set up both bridging units in the same room before relocating them to their final locations. Run the setup CD on a nearby computer and follow the instructions. Once both bridges are communicating with each other, you can continue by placing each item in its final location.

Most WiFi bridging devices come pre-installed with a small antenna that can be upgraded to a larger antenna for extended long range WiFi. When using a bridge it is best to mount your antenna outdoors where line of sight can be achieved without obstructions. In this case you may need to extend low-loss coaxial cable between the antenna jack on the bridge and the outdoor antenna.

Boosting Power

If all your equipment is set up and aligned properly and you are still not getting connected, you may need to boost the power. This requires another piece of equipment called a WiFi Signal Booster. The signal booster has two coaxial connectors so it can be placed in-line with the antenna. Connect the "Input" jack on the signal booster to the antenna jack on the bridge. Next, connect the "Antenna" jack on the signal booster to the coaxial cable that leads to your outdoor antenna. If you're using the linked signal booster above with the equipment used in this illustration, you will also need a special connector fitting along with a special pigtail. For even more power, add a WiFi signal booster to both bridges.

FCC Power Output Rules

Unfortunately there are power restrictions (laws) when using WiFi that if exceeded could land you in jail. The FCC limits your total power output using a sliding scale. The scale starts at 30dBm of amplification power while using a 6dBi directional antenna. Then for every 1dBm you drop in amplification power you can increase the power of your directional antenna by 3dBi.

Using a larger point-to-point antenna, your beam pattern will cover less area and cause less interference for others. This is why the FCC allows this sliding scale.
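The sliding scale described above reduces to a small formula: start at 30 dBm of transmitter power with a 6 dBi antenna, and give up 1 dBm of transmitter power for every additional 3 dBi of antenna gain. A quick sketch:

```python
# Encode the FCC point-to-point sliding scale as stated above:
# 30 dBm is allowed with a 6 dBi directional antenna, and each
# extra 3 dBi of antenna gain costs 1 dBm of transmitter power.

def max_tx_power_dbm(antenna_gain_dbi):
    """Max transmitter power (dBm) for a directional antenna, per the rule above."""
    extra_gain = max(0, antenna_gain_dbi - 6)
    return 30 - extra_gain / 3.0

for gain in (6, 12, 24):
    print(gain, "dBi ->", max_tx_power_dbm(gain), "dBm")
```

So a 24 dBi dish, for example, limits you to 24 dBm of amplification under this scale.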

There is now an easier and more effective way to create a WiFi Bridge: getting the WiFi Bridge Kit from C. Crane.

Oct 30, 2013

World's deepest underwater railway tunnel opens 150 years after a sultan first imagined it

An underwater railway tunnel is now open between the eastern and western parts of Istanbul. The tunnel is the world's first to connect two continents: traveling under the waters of the Bosphorus strait, it joins the Asian and European halves of Turkey's largest city together. It's also the world's deepest underwater railway tunnel of its type, according to Turkish officials, sitting 190 feet (58 meters) below the surface of the Bosphorus.


"The tunnel connects Asia and Europe under the Bosphorus"

The BBC reports the project was first thought up by an Ottoman sultan in the 1860s, but received more timely backing from current prime minister Recep Tayyip Erdogan. Work on the project started in 2004, but was delayed by archaeological digs after the remains of a Byzantine fleet were discovered in the area. The railway — named "Marmaray" for the nearby sea of Marmara, and capable of carrying 75,000 people per hour in both directions — was finally inaugurated yesterday, on the 90th anniversary of the Turkish republic's creation.


The tunnel is 8.5 miles long, but the distance under the Bosphorus itself is fairly short: only 0.8 miles. It was completed with help from Japan, which sent engineers to the country and added $1 billion to the project's $4 billion budget. Previously, the Bosphorus could only be traversed by ferry, or on one of two bridges. The AFP news agency reports that 2 million people — in a city of 16 million — cross those bridges each day, leading to terrible congestion. Istanbul's mayor, Kadir Topbas, said the new tunnel will "soothe" that congestion.

"Construction was delayed by an ancient submerged Byzantine fleet"

But the project has also come under fire inside Turkey. The country sits on a fault line, and the tunnel doesn't have an electronic earthquake warning system. The Guardian quotes Rıza Behçet — an engineer who worked on the project — as saying he "would not get on the Marmaray metro line, and nobody else should either." Other complaints have been aimed at Erdogan directly. The prime minister was once mayor of Istanbul, and his far-reaching development plans for the city — including a third airport, a third bridge over the Bosphorus, and a second tunnel — have faced protests in the past: most notably in June of this year when police violently dispersed protesters attempting to stop the urbanization of one of Istanbul's few parks.

Huawei is 3rd in global smartphone shipments, LG grows, Apple slows, Samsung leads

A new report by research and consulting firm Strategy Analytics outlined global smartphone shipments for Q3 2013. Total smartphone shipments reached a record 251 million last quarter, a rise of 45 percent overall. Here are some of the firm’s findings:

Huawei

China’s Huawei smartphone shipments grew 67 percent since this time last year to 12.7 million units. That makes Huawei the third-largest smartphone manufacturer in the world, confirming IDC’s findings in Q4 of last year.

Huawei might still be hitting hurdles from government regulators (most recently in Taiwan), but the company is steadily expanding its global reach.

Samsung

Samsung (KRX:005935) leads the pack with a 35 percent share of all smartphone shipments worldwide. It grew 55 percent annually and shipped 88.4 million smartphones, increasing its lead from 33 percent.

Apple

Apple (AAPL) sits in second place, but only grew half as much as the industry average. It shipped 33.8 million iPhones, but only grew 26 percent. Apple just released a decidedly weaker earnings report today.

LG

LG (KRX:066570; LSE:LGLD) actually grew the fastest of any smartphone maker, putting it in fourth place, according to the report. The South Korean smartphone maker has been expanding rapidly in Europe, growing 71 percent overall.

Lenovo

Lenovo (0992.HK) took fifth place with 10.8 million smartphones shipped and a four percent market share. Even with Huawei and Lenovo’s combined market shares, Chinese smartphone makers still trail far behind Apple and Samsung.

(Source: Herald Online)

Progskeet 1.2 Testing Nand/Nor PS3

After a long while with the ProgSkeet 1.2 and trying to dump a NAND YLOD PS3, I can confirm that the ProgSkeet 1.2 is completely working for NAND PS3s.

Yesterday at the D3M IRC, I talked with sir bailey and thought it would be useful to make a detailed thread about what I did.

Setting up before using anything..
  1. Install the ProgSkeet 1.2 drivers for Windows XP: "drivers_winusb_111121"
  2. Install the Infectus drivers that come with "InfectusProgrammer-3.9.9.0"
  3. Program the ProgSkeet 1.2 with the newest bitstream "130415_2019 (NAND) & 130412_1647 (NOR)", the NAND one
  4. Download the newest WinSkeet: "WinSkeet40000_130425_2056 for 1.2"
Now I had everything set up and I was ready to test :D


After testing on a TSOP-48 DIP adapter, I noticed the NAND would only be recognized when connecting pin 12 to a 3.3V pin on the skeet and pin 13 to ground on one side of the NAND; the other side of the NAND, which has pin 37 (3.3V) and pin 36 (GND), must be connected to another pair of 3.3V and GND points on the ProgSkeet!


Then, and only then, would the ProgSkeet 1.2 recognize my Samsung K9F1G08U0A-PIB0 chip as ECF1801540 when clicking the AUTO button in WinSkeet.


Using TSOP48 socket for NAND/NOR Chips jumpered to Progskeet 1.2


Flashing and dumping with the correct chip ID; NOR tested. From DigiProg, helpful tips for the NAND setup are as follows:

- ZIF socket TSOP48 with AWG30 wires
- Vcc and GND taken from the ProgSkeet 1.2
- Short and shielded flat 15-pin cable
- Short USB cable
- R7 closed and R8 open, as default
- Bitstream 1223 or 2019 (2019 is a little bit faster, about 5/6 seconds; almost identical)
- WinSkeet, latest stable release, v111205

Samsung K9F1G08U0A-PIB0 is recognized as ECF1801540 Samsung, while Samsung K9F1G08U0B-PIB0 is recognized as ECF1009540 Samsung.
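Those ID strings can be unpacked byte by byte: the leading 0xEC is Samsung's JEDEC manufacturer code, and the next byte is the device code. A small decoder sketch (the KNOWN_IDS table simply restates the two readings above):

```python
# Decode the chip IDs WinSkeet reports. 0xEC is Samsung's JEDEC
# manufacturer code; the table below is just the two observed
# readings from this post, not an exhaustive database.

KNOWN_IDS = {
    "ECF1801540": "Samsung K9F1G08U0A-PIB0",
    "ECF1009540": "Samsung K9F1G08U0B-PIB0",
}

def decode_id(id_hex):
    """Return a human-readable description of a raw NAND ID string."""
    maker = int(id_hex[0:2], 16)
    device = int(id_hex[2:4], 16)
    vendor = "Samsung" if maker == 0xEC else f"maker 0x{maker:02X}"
    name = KNOWN_IDS.get(id_hex, "unknown part")
    return f"{vendor}, device 0x{device:02X}: {name}"

print(decode_id("ECF1801540"))
```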

Oct 28, 2013

Intel Takes on Mini ITX

Intel prepares its new mini ITX motherboard offering

Intel is poised to take on VIA’s mini ITX motherboards with its upcoming Little Valley D201GLY, which is part of its Intel Desktop Board Essential Series. The new D201GLY is the first Intel branded motherboard to feature a SiS chipset. Intel previously used ATI chipsets in its D103GGV and D102GGC2 budget motherboards; however, Intel kicked ATI to the curb after ATI's acquisition by AMD.

The upcoming Intel D201GLY motherboard is an all-in-one solution with an integrated processor, similar to VIA’s EPIA series. It features the SiS662 north bridge paired with the SiS964L south bridge. The SiS662 features integrated SiS Mirage 1 graphics. The integrated graphics core is AGP8x-based and features hardware accelerated DVD decoding. It is not Vista Premium ready.

Intel integrates a Yonah-based Celeron processor on the D201GLY. The integrated processor is a Celeron 215 in a BGA 479 package. Intel clocks the Celeron 215 at 1.33 GHz on a 533 MHz front-side bus. It also has 512KB of L2 cache. Despite being Yonah-based, the Celeron 215 is a single-core processor.

Other notable features of the Intel D201GLY include DDR2-533/400 MHz memory support, one PCI slot, 10/100 Ethernet, six USB 2.0 ports and optional S-Video output.

Expect Intel to release the D201GLY with the integrated Celeron 215 in the end-of-May or early-June timeframe.

Oct 27, 2013

How-To Create Linux BackTrack USB Flash Drive on Windows

How to make a BackTrack Linux flash drive using Windows. BackTrack is a Live Linux distribution based on SLAX that is focused purely on penetration testing. Distributed by remote-exploit.org, BackTrack is the successor to Auditor. It comes prepackaged with security tools including network analyzers, password crackers, wireless tools and fuzzers. Although originally designed to boot from a CD or DVD, BackTrack contains USB installation scripts that make portable installation to a USB device a snap. In the following tutorial, we cover the process of installing BackTrack to a USB flash drive from within a working Windows environment.


Distribution Home Page: http://www.backtrack-linux.org

Minimum Flash Drive Capacity: 2GB+

Persistent Feature: Yes (Backtrack 4)

BackTrack USB prerequisites:
  • Universal USB Installer (does the USB conversion)
  • BackTrack ISO
  • 2GB+ USB flash drive (fat32 formatted)
  • A Windows host PC to perform the build
How to install BackTrack to a USB device from Windows:
  1. Download and launch our Universal USB Installer, select Backtrack and follow the onscreen instructions
  2. Reboot and set your BIOS or Boot Menu to Boot from the USB device and proceed to boot
Note: Once Backtrack has loaded, you must type startx at the prompt to start the graphical X environment.

If all went well, you should now be running from your very own Portable Backtrack on USB!

USB Extender Kit 60m - USB Over LAN Long Distance Extender up to 60m

Specification

Compliant: USB Specification 1.1
Connectors: USB Type A Male, RJ45
Power mode: USB Bus powered (no external PSU required)
Housing: Retail pack
OS Support: 2000, XP, Vista, Windows 7
Contents: USB Extender Adapter Kit

Product Information

This USB Extender Adapter Kit allows you to run a device up to 60m away from a PC using Cat5e LAN cables.

This USB Extender Adapter Kit comes with two extender boxes and all you need to get started is a spare USB Port and a CAT5e (patch) network cable.

It's ideal because USB cables can only be extended up to 5m before losing signal quality; this kit avoids that loss by extending the connection over standard LAN cables.

This USB Extender Adapter Kit will allow you to have your USB cameras, printers, webcams, keyboard/mouse, USB to Serials or any other USB device exactly where you want it without having to move your PC/Laptop.

Extends the transmission distance of the USB device by up to 60m via Cat5e cable

This USB Extender Adapter Kit is easy to install and use, and although it is only USB 1.1 specification, it still provides a great, affordable and convenient way of solving remote/distance connection problems that would otherwise have meant moving your devices or PC/laptop closer, or buying expensive wireless extenders to do a similar job.


World's Fastest USB 3.0 IP

Here you will see demonstrated the fastest USB 3.0 IP in the Universe*. Or at least the fastest published numbers that aren't marketing hype.

This demo shows SuperSpeed USB 3.0 effective throughput:
  1. SuperSpeed USB 3.0 can really move data.
  2. Synopsys USB 3.0 IP can really move data.
The demonstration includes our USB 3.0 xHCI Host Controller, USB 3.0 Device Controller, and USB 3.0 PHYs.

You have to actually watch the video to see the effective throughput.

First, I have to say that that is about the most awesome thumbnail picture of me yet. Thank you YouTube!


Second, we optimized the PC systems as follows:
  • RAM Drive on the Mass Storage Device side – This is a lot faster than a flash drive, an HDD, or an SSD. There’s no SATA or PCIe for the data to pass through, so there is zero latency from an additional protocol. The RAM is right next to the USB controller so there is basically zero read/write latency.
  • Windows 7 with an MCCI USB 3.0 xHCI Host Stack – Somehow MCCI engineered this so it’s faster than stacks we’ve seen packed with off-the-shelf Host cards.
  • Nothing else is running on the USB bus or PCIe bus on the PC. Very little is running on the PC.
  • Standard PCs built with standard parts with SSDs (which aren’t really necessary but we wanted to make sure)
  • Our IP – Our USB 3.0 PHY IP, Our USB 3.0 Host IP, and Our USB 3.0 Device IP.
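For a rough sense of what "effective throughput" means here: SuperSpeed signals at 5 Gbit/s, but 8b/10b line encoding caps the raw payload at 500 MB/s, and protocol overhead trims that further. A back-of-the-envelope sketch (the 20% overhead figure is an assumption, not a measured value):

```python
# Back-of-the-envelope USB 3.0 SuperSpeed throughput estimate.
# 5 Gbit/s signaling with 8b/10b encoding gives 500 MB/s of raw
# payload; the protocol_overhead fraction here is an assumed
# placeholder, not a number from the demo.

LINE_RATE_GBPS = 5.0   # SuperSpeed signaling rate

def usb3_throughput_mb_s(protocol_overhead=0.20):
    """Estimate effective USB 3.0 throughput in MB/s."""
    raw_bytes_per_s = LINE_RATE_GBPS * 1e9 * (8 / 10) / 8  # 8b/10b, bits -> bytes
    return raw_bytes_per_s * (1 - protocol_overhead) / 1e6

print(round(usb3_throughput_mb_s()))  # effective MB/s under the assumed overhead
```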

SteamOS could really help desktop Linux adoption, says Torvalds

The Linux desktop revolution is just around the corner!

This is a familiar refrain that has received new life in recent months thanks to Valve and its efforts to turn Linux into a gaming platform with the Steam client for Linux (shown above) and the Linux-based SteamOS.


Even Lars Gustavsson, the chief game maker for DICE, which is the EA-owned studio responsible for the Battlefield series, has a strong interest in Linux for games.

There’s so much Linux love in the air that it prompted Linus Torvalds, overlord of the Linux Kernel, to tentatively suggest that Valve’s announcements could encourage Linux adoption on desktop PCs. Screech! Not again, I hear you say?

Yes, we’ve heard the claim for years that the Linux (or GNU/Linux depending on your persuasion) desktop revolution is just around the corner. And yes, this could be just another high hope in a long history of high hopes, but Torvalds reinforced some important arguments about a Steam-powered rise for Linux.

“I think [the Steam announcements are] an opportunity to maybe really help the desktop,” Torvalds said recently during LinuxCon + CloudOpen Europe in Edinburgh, Scotland. That’s not exactly a ringing endorsement for a Linux revolution but, if anyone is familiar with endless promises of Linux-based desktops becoming popular, it’s Torvalds.

For Torvalds, Valve’s Steam efforts could be a big opportunity to drive desktop Linux because it could force the various desktop Linux distributions to standardize their technology. Torvalds said earlier in the 44-minute talk (shown below) that the Linux desktop was a “morass of infighting.” (The Steam talk starts around 29:50 for those who want to fast forward.)


Critics’ reasoning

A criticism often leveled at Linux OS distributions—and contributing projects such as the Gnome desktop—is that each component insists on doing things its way, or going in a different direction, or breaking compatibility. This can result in fights over everything from the best bootloader to which desktop UI is superior (ridiculous since everybody knows Unity rocks).

Some critics, such as Gnome project founder Miguel de Icaza, put at least some of the blame at the feet of Torvalds. Regardless of who’s at fault, most critics agree that the Linux desktop is a house divided right now, which is why a company like Valve and the success of Steam is so sorely needed.

“[Valve] is this one company who has this vision for how to do things,” Torvalds said. “I think it also forces the different distributions to realize ‘hey, if this is the way Steam is going, we need to do the same thing. Because we want people to be able to play games on our platform too.’”

Having everyone toe the line for popular products such as Steam for Linux is an excellent way to set technology standards, Torvalds argues. “Good standards are people doing things,” Torvalds said. “And saying ‘this is how we do it’ and being successful enough to drive the market.”

Change in the air?

Already, Valve appears to be influencing how major hardware vendors approach Linux. Shortly after SteamOS was announced, both AMD and Nvidia announced improved driver support for Linux. And AMD’s low-level Mantle support could result in more top-tier games landing on Linux.

But technology is only half the battle. As DICE’s Gustavsson said, it will also take that one killer app to really push Linux as a PC platform: the one game that everyone must play, playable only on a Linux distribution.

Will that game come from Valve in the coming months? An early look at Half-Life 3 perhaps? Only time will tell. But hey, if you’re waiting for the Linux desktop revolution to happen, you’ve got nothing but time.

[via PCPro]

Top 7 Best Linux Distributions 2013

Back in 2010 Linux.com published a list of the year's top Linux distributions, and the popularity of the topic made it an instant annual tradition.

There have been several shifts and shakeups on the lists presented since then, of course, and – as you'll soon see – this year's offering holds true to that pattern. In fact, I think it's safe to say that the past year has seen so much upheaval in the desktop world – particularly where desktop environments are concerned – that 2013's list could come as a surprise to some.

Let me hasten to note that the evaluations made here are nothing if not subjective. There also is no such thing as the “one best” Linux distro for anything; in fact, much of the beauty of Linux is its diversity and the fact that it can be tweaked and customized for virtually any taste or purpose. The one best Linux for you, in other words, is the flavor you choose for your purpose and preference and then tweak until it feels just right.

Still, I think some Linux flavors stand out these days as leaders for particular use cases. I'm going to diverge a bit from past lists here when it comes to those categories, however. Specifically, where past lists have included the category “Best Linux LiveCD,” I think that's become almost obsolete given not just the general shift to USB drives -- some PCs don't even come with CD drives anymore -- but also the fact that most any Linux distro can be written to a bootable USB stick.
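As a concrete (if simplified) sketch of what "bootable form" means in practice: the usual route is to write a distro's ISO image byte-for-byte to a USB stick with `dd`. The snippet below uses temporary files as stand-ins for the ISO and the USB block device (which on real hardware would be something like /dev/sdX), so it can be run safely anywhere; the paths are placeholders, not real downloads.

```shell
# Sketch of writing a distro ISO to a USB stick with dd.
# Stand-ins are used so nothing is overwritten: on real hardware,
# ISO would be e.g. ubuntu-13.04-desktop-amd64.iso and TARGET the
# USB block device (double-check the device name -- dd will happily
# destroy the wrong disk).
ISO=$(mktemp)     # stand-in for the downloaded ISO image
TARGET=$(mktemp)  # stand-in for the USB device, e.g. /dev/sdX
printf 'bootable-image-bytes' > "$ISO"

# dd copies the raw image; a large block size (bs=4M) speeds up
# real USB writes. Stats are silenced here to keep output clean.
dd if="$ISO" of="$TARGET" bs=4M 2>/dev/null
sync              # flush buffered writes before removing the stick

# Verify the copy is byte-identical to the source image.
cmp -s "$ISO" "$TARGET" && echo "image written verbatim"
rm -f "$ISO" "$TARGET"
```

The same pattern is what graphical tools such as UNetbootin automate behind the scenes; `dd` is just the lowest-common-denominator way to do it from any Linux shell.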

On the other hand, with the arrival of Steam for Linux, I think this year has brought the need for a new category: Best Linux for Gaming.

Read on, then, for a rundown of some of the best of what the Linux world has to offer.

Best Desktop Distribution

There are so many excellent contenders for desktop Linux this year that it's become a more difficult choice than ever – and that's really saying something.

Canonical's Ubuntu has made great strides in advancing Linux's visibility in the public eye, of course, while Linux Mint and Fedora are both also very strong choices. Regarding Ubuntu, however, a number of issues have come up over the past year or so, including the insertion of online shopping results into desktop searches – an addition Richard Stallman and the EFF have called “spyware.”

At the same time, the upheaval caused by the introduction of mobile-inspired desktops such as Unity and GNOME 3 continues unabated, spurring the launch of more classically minded new desktops such as MATE and Cinnamon along with brand-new distros.

For best desktop Linux distro, I have to go with Fuduntu, one of this new breed of up-and-comers. Originally based on Fedora but later forked, Fuduntu offers a classic GNOME 2 interface – developed for the desktop, not for mobile devices -- and generally seems to get everything right.

Besides delivering the classic desktop so many Linux users have made clear that they prefer, Fuduntu enjoys all the advantages of being a rolling release distribution, and its repository includes key packages such as Netflix and Steam. I've been using it for months now and haven't seen a single reason to switch.

Best Laptop Distribution

At the risk of sounding repetitive, I have to go with Fuduntu for the laptop category as well. In fact, the distro is optimized for mobile computing on laptops and netbooks, including tools to help achieve maximum battery life when untethered. Users can see battery life improvements of 30 percent or more over other Linux distributions, the distro's developers say.

Such optimizations combined with this solid and classic distro make for a winner on portable devices as well.

Best Enterprise Desktop Linux

The enterprise is one context in which I have to agree with recent years' evaluations, and that includes the enterprise desktop.

While SUSE Linux Enterprise Desktop is surely RHEL's primary competitor, I think Red Hat Enterprise Linux is the clear leader in this area, with just the right combination of security, interoperability, productivity applications and management features.

Best Enterprise Server Linux

It's a similar situation on the server. While there's no denying SUSE Linux Enterprise Server has its advantages, Red Hat is pushing ahead in exciting new ways. Particularly notable about Red Hat this year, for example, is its new focus on Big Data and the hybrid cloud, bringing a fresh new world of possibilities to its customers.

Best Security-Enhanced Distribution

Security, of course, is one of the areas in which Linux really stands out from its proprietary competitors, due not just to the nature of Linux itself but also to the availability of several security-focused Linux distributions.

Lightweight Portable Security is one relatively new contender that emerged back in 2011, and BackBox is another popular Ubuntu-based contender, but I still have to give my vote to BackTrack Linux, the heavyweight in this area whose penetration testing framework is used by the security community all over the world. Others surely have their advantages, but BackTrack is still the one to beat.


Best Multimedia Distribution

Ubuntu Studio has often been named the best distro for multimedia purposes in Linux.com's lists, but it's by no means the only contender. ZevenOS, for instance, is an interesting BeOS-flavored option that came out with a major update last year.

For sheer power and nimble performance, though, this year's nod goes to Arch Linux. With an active community and thousands of software packages available in its repositories, Arch stays out of the way so your PC can focus on the CPU-intensive tasks at hand.

Best Gaming Distribution

Last but certainly not least is the gaming category, which surely represents one of the biggest developments in the Linux world over this past year. While it may not be relevant for enterprise audiences, gaming has long been held up as a key reason many users have stayed with Windows, so Valve's decision to bring its Steam gaming platform to Linux is nothing if not significant.

The Linux distro choice here? That would have to be Ubuntu, which is specifically promoted by the Valve team itself. “Best experienced on Ubuntu” reads the tag line that accompanied the Steam for Linux release last month, in fact. Bottom line: If you're into gaming, Ubuntu Linux is the way to go.