
Data Storage trends in 2015!

The enterprise world is evolving and is constantly under pressure to deliver the highest-performance and most cost-effective IT services. As a result, information plays an important role and has become a valuable asset for any organization. This is mainly because organizations are quickly learning to store, secure, access and analyze data while managing it in a secure and cost-effective way. Together, these factors can make the difference between business success and failure.

In the present business environment, organizations will continue to demand fool-proof IT infrastructures that allow them to rapidly and efficiently deliver quality services and manage data effectively. Organizations are also adopting flash and hybrid arrays, along with private, public and hybrid cloud storage options, more than ever before.

Since enterprise storage is seeing plenty of innovation enabling greater speed and flexibility, let's discuss some of the popular trends that have a great future ahead:

Cloud Storage- The cloud storage business is all set to experience a golden era in 2015. The business is estimated to touch $7 billion in the coming year and to reach $100 billion by 2020. As cloud providers promise to reduce infrastructure and management costs while improving accessibility of user data, the future of this business vertical looks promising.

Hybrid Storage- Hybrid storage is a term used to describe storage systems designed with a blend of flash SSDs and hard disk drives. These blended-media solutions provide high-performance storage resources at an affordable price, addressing the dollar-per-IOPS and dollar-per-GB trade-offs that IT professionals grapple with. Additionally, these storage systems answer the question of how best to leverage flash: by using it as a large caching area and automatically moving the most active data onto the faster tier.
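To make the auto-tiering idea concrete, here is a minimal Python sketch of access-frequency-based promotion, the kind of logic a hybrid array's caching layer performs. The block granularity, promotion threshold and class shape are illustrative assumptions, not any vendor's actual implementation.

```python
from collections import defaultdict

class HybridTier:
    """Toy model of hybrid-array tiering: blocks read often enough
    are promoted from the HDD tier to the flash cache."""

    def __init__(self, flash_capacity_blocks=1024, promote_threshold=3):
        self.flash = set()               # block IDs currently held on flash
        self.reads = defaultdict(int)    # per-block access counts
        self.capacity = flash_capacity_blocks
        self.threshold = promote_threshold

    def read(self, block_id):
        self.reads[block_id] += 1
        # Promote a "hot" block once it crosses the access threshold.
        if (block_id not in self.flash
                and self.reads[block_id] >= self.threshold
                and len(self.flash) < self.capacity):
            self.flash.add(block_id)
        return "flash" if block_id in self.flash else "hdd"

tier = HybridTier()
print([tier.read(42) for _ in range(4)])  # ['hdd', 'hdd', 'flash', 'flash']
```

A production array would also demote cold blocks to make room and would weigh recency as well as frequency, but the promotion loop above is the core of the idea.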

Flash Storage- Flash storage uses electricity but has no mechanical parts. It typically consumes less power and reads more than one hundred times faster than traditional mechanical hard drives, so business-intensive applications have found flash storage to be more efficient and cost effective. Most flash storage systems are composed of a memory unit, which stores the data, and an access controller. As interest in flash storage grows enormously day by day, vendors are coming up with more innovations to cater to the demand: they are not only improving the read/write capabilities of flash, but also offering the product at a much lower cost.

Unified Storage and Server (USS) - Over the past couple of years, the concept of unified SAN storage has been in high demand. Enterprise users get the services of a server and storage in one easy-to-manage appliance. Demand for the USS concept is expected to rise in the coming year, as it offers greater flexibility, availability and performance than direct-attached storage.

NAS and iSCSI SAN combination- With demand for file-level and block-level storage growing in parallel, data storage vendors are coming up with appliances that can be used as a NAS appliance, an iSCSI appliance, or both. That means organizations can use a single appliance as a file server, a block-level storage server, or a combination of the two. In the coming years, demand for such appliances is expected to explode.

Hyper-converged Infrastructure storage- As time goes on, enterprises are on the lookout for solutions that support virtualization and automation. This is where demand is rising for a software-centric architecture in which compute, storage, networking and virtualization resources are integrated from scratch in a commodity hardware box supported by a single vendor. Getting into the basics, hyper-convergence grew out of the concept of converged infrastructure. Under the converged-infrastructure approach, a vendor provides a pre-configured bundle of hardware and software in a single chassis, with the goal of minimizing compatibility issues and simplifying management. A hyper-converged system differs from a converged system in its tighter integration of the software components, which may expand beyond compute, storage, networking and virtualization to include other technologies as well.

Data Center PUE and DCiE benchmarking in layman’s terms

Companies and organizations need IT equipment to provide products and services, handle enterprise transactions, provide security, and run and grow their business. The larger an organization grows, the greater its need to house computer equipment in a secure environment. Here, the IT equipment in question includes computer servers, hubs, routers, wiring patch panels and other network equipment.

Depending on the size of the organization, this secure environment is called a wiring closet, a computer room, a server room, or a data center. In addition to the energy needed to run the IT equipment, electric power is used for lights, security, backup power, and climate control to maintain temperature and humidity at levels that minimize downtime due to heat issues.

Therefore, by benchmarking PUE or DCiE, you are comparing the power needed for business-critical IT with the power that keeps that IT equipment alive and protected.

It is evident that IT equipment generates heat, and that heat can cause downtime if it exceeds permissible levels. In a room filled with racks of computers and other IT equipment, heat must be tamed with cooling equipment, so energy costs are incurred for the specialized data center cooling and power equipment deployed to keep your servers and other IT equipment up and running.

Data centers are large, complex environments that often have different strategic teams managing key components. One team focuses on facilities management while another focuses on the IT equipment deployed in the facility. In these environments, facilities managers usually handle infrastructure environmental issues including power, cooling and airflow, while IT managers sort out issues related to critical IT systems such as servers and networking equipment.

Consequently, taking a holistic approach, measuring a computer room or data center's energy consumption is the key first step in determining the appropriate measures to improve energy efficiency. Benchmarking a data center's energy efficiency is also vital to reducing power consumption and related energy costs. It helps in understanding the current level of efficiency in a data center, lays a foundation of best practices for additional efficiency, and helps gauge the effectiveness of those efficiency efforts.

Power Usage Effectiveness (PUE) and its reciprocal, Data Center Infrastructure Efficiency (DCiE), are widely accepted benchmarking standards proposed by the Green Grid. Though many data center IT teams do not accept PUE as a standard, it has been helping IT professionals determine how energy efficient a data center is and monitor the impact of their efficiency efforts. There is also Corporate Average Data Center Efficiency (CADE), suggested by the Uptime Institute. In 2009, at its technical forum, the Green Grid introduced new benchmarks named Data Center Productivity (DCP) and Data Center Energy Productivity (DCeP), which probe into the useful work produced by a data center. These later metrics are still not as widely accepted as PUE, but they have their own value; when used correctly, they can prove an essential tool for improving data center energy efficiency.

Getting a bit procedural, data center PUE and DCiE are efficiency benchmarks comparing your data center's infrastructure to your existing IT load. They set a testing framework the facility can repeat to yield efficiency scores and trends over time, so data center managers can closely track their efficiency efforts. At any given time, they are comparing the power currently used by the IT equipment with the power used by the infrastructure that keeps that IT equipment cooled, powered, backed up and protected.

For those who need a detailed example, here it is:

PUE example- Suppose you have a facility that uses 100,000 kW of total power, of which 80,000 kW is used to power your IT equipment. That gives a PUE of 1.25, obtained by dividing 100,000 kW of total facility power by 80,000 kW of IT power.

DCiE example- Suppose the same facility uses 100,000 kW of total power, of which 80,000 kW powers your IT equipment. That gives a DCiE of 0.8 (i.e. 80%), obtained by dividing 80,000 kW of IT power by 100,000 kW of total facility power.
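For readers who want to script the arithmetic, here is a minimal Python sketch of the two formulas exactly as used in the examples above. The function names are our own, and a real benchmarking effort would feed in repeated meter readings rather than single figures.

```python
def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power."""
    return total_facility_kw / it_kw

def dcie(total_facility_kw: float, it_kw: float) -> float:
    """Data Center Infrastructure Efficiency: the reciprocal of PUE."""
    return it_kw / total_facility_kw

# The figures from the examples above: 100,000 kW total, 80,000 kW IT load.
total_kw, it_kw = 100_000, 80_000
print(f"PUE  = {pue(total_kw, it_kw):.2f}")   # PUE  = 1.25
print(f"DCiE = {dcie(total_kw, it_kw):.0%}")  # DCiE = 80%
```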

Yes, you are right: PUE/DCiE is only a start on your path to efficiency. For the benchmark to be meaningful, it should be generated on a regular basis, and preferably recorded on different days of the week and at different times of day. Your goal should be to take efficiency actions based on your actual data. By comparing your starting benchmark with benchmarks taken after implementing changes, you should see a noticeable improvement in your PUE/DCiE.

Warning: Around 150,000 Hikvision DVRs vulnerable to remote wiping of surveillance footage

Hikvision Digital Technology, a leading supplier of video surveillance products and solutions, has been in the news for the wrong reasons these days. For the past year or so, this China-based company has been in the news for offering networked products that are vulnerable to hackers and online criminals.

Now the latest news is also not at all appealing, and might in fact turn into a concern for Hikvision's sales team.

Rapid7 Labs discovered three vulnerabilities in Hikvision DVRs that an attacker could exploit remotely to take control of the device. The security company has also disclosed that around 150,000 of the company's DVRs are vulnerable to remote shutdown by hackers.

Rapid7 Labs officials say that after Hikvision failed for months to respond to the disclosure, they were forced to reveal the vulnerabilities publicly. The security company disclosed that almost all Hikvision DVRs are exposed to a Metasploit exploit module. No authentication is required to exploit the vulnerability, and the Metasploit module successfully demonstrates gaining full control of the remote devices.

Rapid7 Labs also disclosed that Hikvision DVRs were among the many unsecured security cameras and devices shipping with default passwords, allowing anyone to view the surveillance footage. These devices come with a default administrative account, "admin", and a universally used password, "12345", which ultimately leads to full remote compromise.
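As a purely illustrative, hypothetical sketch of how an administrator might audit for this, the Python below flags inventory entries still using the factory-default credential pair named above. The inventory format is our own assumption, not a Hikvision interface.

```python
# Factory-default credential pair cited in the article.
DEFAULT_CREDS = {("admin", "12345")}

# Hypothetical device inventory; in practice this would come from your
# asset-management system, never from a hard-coded list.
inventory = [
    {"host": "10.0.0.21", "user": "admin", "password": "12345"},
    {"host": "10.0.0.22", "user": "admin", "password": "s7r0ng&uniq"},
]

for dvr in inventory:
    if (dvr["user"], dvr["password"]) in DEFAULT_CREDS:
        print(f"{dvr['host']}: still on factory-default credentials - change immediately")
```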

Rapid7 found that around 150,000 Hikvision DVRs were reachable across the public IPv4 address space and vulnerable to the kind of remote shutdown typically carried out by hackers.

Rapid7 stated that despite repeated attempts to contact the vendor regarding vulnerabilities that would allow an attacker to gain full control of the devices, no one from Hikvision responded to confirm or dispute the findings. The security firm said it contacted the vendor about the flaws on September 15th; in October, the flaws were disclosed to the CERT Coordination Center, which then assigned CVE identifiers.

When the Chinese company did not respond, Rapid7 disclosed the vulnerabilities to the public on November 19th, 2014.

SANS Institute researcher Johannes Ullrich previously discovered that many Hikvision DVRs were being exploited by "The Moon" worm. He also reported that the infected devices were part of a botnet being used for Bitcoin mining and for scanning for Synology disk stations. Ullrich clearly stated that the main exploit vector was the default root password of "12345", which never got changed.

As soon as Ullrich's discovery reached social media, Hikvision officials reacted and promised to protect users' interests. But later, the Hikvision Security Response Center condemned hacking that damages users' interests under the excuse of flaw testing.

Ullrich responded to the comment by saying, "Ummmm, no hacking is needed to steal user privacy if the vendor allows the weak default password to be used".

To mitigate these exposures until a patch is released, Rapid7 has advised that Hikvision DVRs not be exposed to the internet without the usual additional protective measures, such as an authenticated proxy, VPN-only access and so on.

The security firm said that Hikvision has not responded to this advice and has not offered any kind of solution to date.

Hikvision's point of view- The company, known as a leader in offering cutting-edge video surveillance solutions, said that its focus is not merely on offering security products but on total security solutions for a broad variety of vertical markets. It hinted that the discovery was a clear-cut misunderstanding, or an activity by competitors to malign the company's image.

StorageServers comment- Whatever Rapid7's motive may be, the fact is that Hikvision's image is being tarnished in the surveillance market. If this continues, there is a chance the Chinese maker may lose the US market in no time. So the only solution, from our point of view, is for Hikvision to come up with a fool-proof fix that leaves the media and its sources nothing to report. At the same time, it should try to stay in Rapid7's good books from now on.

We hope someone from Hikvision takes this matter seriously, at least from now on, and tries to sort out this issue amicably as soon as possible.

Google will keep track of your online shopping this Black Friday and Thanksgiving!

Google, the internet juggernaut, is all set to keep track of your online shopping deals this Black Friday and Thanksgiving. For all you people busy with shopping this long weekend, this is just an update. But for those who have decided to shop online, this is a factual piece of info to keep in mind.

Google is planning to track all the online shopping deals its search patrons participate in. It will track which products were in high demand, which websites were selling those products, and at what price. The internet search giant will then create a database of the deals that were searched, made and reverted this festive season. By tracking this data, it is clear the search giant is interested in building a database of what its users purchased online this Thanksgiving season. This will also give Google visibility into how many transactions took place over the holiday.

A reliable source at Google has disclosed that the search giant is interested in selling this data to corporations in the manufacturing and sales business. The purchased data will then be analyzed by each company's sales head, who will in turn get an idea of which products were in demand this season, and which could be in the next festive season.

Stuart Lynch, a business analyst at a reputed Fortune 500 electronics manufacturer, said that Google has been offering such services to the sales departments of reputed firms since 2008, but for some reason kept the arrangement secret. Now the company wants to get into the open, and so has decided to publicize its online shopping analysis to the world.

It should be noted here that Facebook also engages in a similar activity, and has been doing so for the past 4 years.

Pauline Thatcher, who works for a firm named 'Ventura Analytics', says that this kind of generic data adds vigor to the field of big data analytics. Pauline also feels that this data has the potential to become a boon to some manufacturers (irrespective of their business vertical) if carefully harvested and analyzed.

So, this festive season, all your Thanksgiving 2014 and Black Friday 2014 deals will be tracked by Google, and this may lead to better deals in the future.

SanDisk promises 16TB SSDs by 2016 and says it will leave Intel way behind

SanDisk Corporation, the American giant that develops flash memory, has promised to offer 16TB SSDs by 2016 and says Intel, with its 10TB SSD offering, will be left way behind by then.

Reacting to the news that Intel will present a 10TB sample SSD by 2015, a SanDisk enterprise storage bigwig said his company is all set to commercially offer a 16TB solid-state drive by early 2016, so he felt Intel will be way behind in this offering.

Brian Cox, SanDisk's senior director of marketing for enterprise storage, spoke at a recent Gartner Data Center Summit in London and said that his company, which currently offers 4TB drive technology, would double capacity each year, offering 8TB by 2015 and 16TB by early 2016.

SanDisk recently bought Fusion-io, obtaining its PCIe flash hardware and software as well as the ioControl hybrid array products that Fusion-io had gained by acquiring NexGen.

Storage industry veterans feel that SanDisk gained a great level of confidence by acquiring Fusion-io, and that is clearly reflected in its recent product announcements.

SanDisk is offering 'Bit Cost Scalable' (BiCS) NAND as part of its exploration of 3D NAND technology. It will be used instead of current 2D (planar) NAND when it makes economic sense. The company expects a TLC BiCS implementation to come along eventually and be fully implemented in a couple of years' time.

SanDisk lists another technology on offer, ULLtraDIMM, saying the product will offer lower data access latency than PCIe flash; it lists Lenovo, Super Micro and Huawei as OEM customers taking the product.

Data Storage for the Medical Industry!

Data storage needs are increasing in the medical industry, and in parallel, new healthcare data regulations and considerations are coming into place. As both new technology and new regulations mount, clear IT strategies and partnerships with reliable data storage partners such as DNF Medical are becoming crucial for healthcare organizations and their associated businesses.

In 1996, the Health Insurance Portability and Accountability Act (HIPAA) was signed into law. The basic goal was to standardize the way healthcare data (billing info, lab reports, clinical data, patient reports and so forth) was exchanged between healthcare providers and organizations, and to secure and protect the confidentiality of patients' medical histories and related data.

In other words, the law was intended to restrict the information strictly to those people authorized to know the details, and to keep it away, whether accidentally or on purpose, from people who have no reason to see it. These stipulations were meant to reduce healthcare fraud and abuse, lower healthcare costs and improve access to health insurance.

These regulations applied to most healthcare organizations and providers, including doctors, dentists, insurance plans and healthcare clearinghouses, whether private or public. The law required all the entities mentioned above to establish appropriate measures addressing the physical, technical and administrative components of patient data privacy.

In 2003, the Security Rule was added, dealing specifically with electronic patient data. Prior to this rule, there was no universal set of standards for handling digitized patient data. While still relatively new to the medical field, computers were being adopted quickly, and data digitization promised to make the healthcare industry more efficient and cost effective.

As per the new rule, HIPAA-covered entities must:

  1. Ensure the confidentiality, integrity and availability of all electronic protected health information the covered entity creates, receives, maintains or transmits.
  2. Protect against any reasonably anticipated threats or hazards to the security or integrity of such information.
  3. Protect against any reasonably anticipated uses or disclosures of such information that are not permitted or required.
  4. Ensure compliance with this subpart by its workforce.

To comply with the above rules, covered entities are required to implement a series of backup, disaster recovery and emergency-mode plans:

Data backup plan- A data backup plan must be established and implemented to create and maintain retrievable, exact copies of electronic protected health information (a minimal sketch of this requirement follows these items).

Disaster recovery is essential- Entities must have a disaster recovery plan in place in advance, establishing and implementing procedures to restore any lost data.

Emergency mode of operation- Establish and implement procedures to enable continuation of critical business processes, and protection of the security of electronic protected health information, while operating in emergency mode.
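As promised above, here is a minimal, hypothetical Python sketch of the "retrievable exact copies" idea: copy a file and prove the copy matches byte for byte via a checksum. The paths and hash choice are illustrative assumptions only; a real HIPAA backup plan also involves encryption, audit controls, retention policies and much more.

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    """Hash a file in chunks so large records need not fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_and_verify(source: Path, backup_dir: Path) -> Path:
    """Create a copy and verify it is an exact, retrievable duplicate."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    dest = backup_dir / source.name
    shutil.copy2(source, dest)             # copies data plus timestamps
    if sha256(source) != sha256(dest):     # byte-for-byte verification
        raise IOError(f"backup of {source} failed checksum verification")
    return dest

# Hypothetical paths, for illustration only:
# backup_and_verify(Path("/ehr/records/patient_1234.dat"), Path("/mnt/backup/ehr"))
```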

Covered entities are afforded considerable flexibility in how they comply, but each must limit access to data, secure data with measures such as encryption, implement audit controls, and secure health information transmitted over electronic networks.

As technology and regulations in the healthcare industry grow, storing health-related data is becoming crucial. Some healthcare providers opt for colocation, while others are thinking of keeping data at their own location for faster access, better security, maintained integrity and assured patient satisfaction.

Since the sheer amount of health data continues to grow exponentially, data storage must grow with it. But healthcare organizations are finding it difficult to select a vendor that can offer solutions that are reliable, secure, scalable and affordable.

DNF Medical can tailor the solutions organizations require, as it offers products for every medical imaging and healthcare data management requirement, with fully integrated tower or rack-mounted medical imaging management appliances for mission-critical deployments.

For more details call 510.265.1122 or click on DNF Medical Genesis appliances

Hyper Converged infrastructure makes virtualization a boon to businesses

Server virtualization is being embraced by many businesses, irrespective of their size and operational vertical. But some IT teams at noted business firms still feel intimidated by getting up to speed on the technology. To overcome this fear, they are on the lookout for a solution that ensures all integrated components work together automatically and transparently in the background.

The technology of hyper-convergence proves a boon in this situation, helping enterprises make the best use of virtualization in an IT environment. Hyper-convergence technology gives each application the IOPS, throughput and caching it needs, while also meeting its data protection and recoverability requirements.

Virtualized appliances in this space differ from converged systems in that they go beyond the packaging of racked server, storage and networking components from brand-name vendors, each bound to its own separate management functions and requiring a level of IT expertise to deploy and maintain, even if accessible under a single interface.

These troubles can easily be solved with the new hyper-converged appliances, which blur the lines across what were once separate components, using commodity hardware with a focus on scale-out, high-performance and highly efficient storage.

These systems enable a building-block approach, making it easy to add nodes to the total pool and simplifying the networking and processing power needed for automated operations. They also offer a comprehensive central management layer that eliminates the need for administrators to develop wholesale infrastructure expertise.

Neil Medical Group, a long-term-care pharmacy and medical supply distributor with a division that handles Medicare Part B billing services, is taking advantage of hyper-convergence technology to get the best out of its virtualized infrastructure. Until recently, the medical group was running physical servers attached to separate storage area networks in its two data centers, located in Kinston and Mooresville, North Carolina.

Last year, that infrastructure reached the end of its life, so the medical group was looking at a forklift upgrade. The IT director had two options: reproduce what was already there, or choose a path that would use virtualization to the core and improve backup and disaster recovery operations.

IT director Chad Benfield chose to go with virtualization, as he thought it made more sense for critical apps, not just minor functions. So they went with a company offering hyper-converged appliances with virtualization at their core.

As a result, just four IT staff members now support Neil's 300 users, who require access to enterprise resource planning, pharmacy information systems and warehouse management software. In addition, hundreds of external customers and partners interface with the company through its e-commerce website, pharmacy portals and other venues.

Consequently, with so many benefits on offer from a hyper-converged appliance, what are you waiting for?

StoneFly, Inc., a subsidiary of Dynamic Network Factory, offers a Unified Storage & Server hyper-converged appliance in its product catalog. With this appliance, an enterprise's storage and server systems can be consolidated into one easy-to-manage appliance. With the use of a virtualized operating system, complete hardware utilization and considerable reductions in power and cooling costs can be achieved.

StoneFly™ USS Hyper Converged Infrastructure helps in replacing the fixed hardware model of the past with on-demand resource allocation based on your application needs.

For more details call 510.265.1616 or click on StoneFly™ USS Unified Storage & Server Hyper Converged Appliance
