
Edge data centers are changing the geography of the internet!

Online video streaming is growing all over the world, giving a tough time to cable TV and satellite providers. At the same time, the physical nature of internet infrastructure is changing as businesses consume more and more cloud services instead of buying hardware boxes and software licenses. Whether your drug of choice is ‘How I Met Your Mother’ on Netflix or ‘Firefly’ on Amazon Prime, web content companies are doing everything they can to make sure you can binge in high definition with as little delay as possible. The same applies to cloud services, where performance at the user’s end really matters.

In recent years, online content providers have therefore been working to improve the delivery of this high-bandwidth content to users outside top metros like New York, Los Angeles, or San Francisco. The best way to do so is to cache the most popular content or web application data on servers closer to so-called “tier-2 markets” such as Phoenix, Minneapolis, or St. Paul. This keeps the content close to users and immediately accessible on demand.
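To make the caching idea concrete, here is a minimal cache-aside sketch of what an edge node does: serve popular objects from local storage and reach back to the distant origin only on a miss. This is an illustrative sketch in Python; the origin URL and cache directory are hypothetical placeholders, not any provider’s actual software.

    # Minimal cache-aside sketch of an edge node: serve popular content from
    # local storage, fetch from the distant origin only on a cache miss.
    # The origin URL and cache directory are illustrative placeholders.
    import hashlib, os, urllib.request

    CACHE_DIR = "/var/cache/edge"          # hypothetical local cache path
    ORIGIN = "https://origin.example.com"  # hypothetical origin in a core market

    def get(path):
        key = hashlib.sha256(path.encode()).hexdigest()
        local = os.path.join(CACHE_DIR, key)
        if os.path.exists(local):                        # cache hit: served locally
            with open(local, "rb") as f:
                return f.read()
        with urllib.request.urlopen(ORIGIN + path) as r: # cache miss: long haul to origin
            data = r.read()
        os.makedirs(CACHE_DIR, exist_ok=True)
        with open(local, "wb") as f:                     # keep a copy for the next viewer
            f.write(data)
        return data

The more viewers in a tier-2 market request the same show, the more often the local branch of this logic is taken, which is exactly why the caching servers end up sitting in edge facilities near those viewers.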

Caching content closer to users is also creating a new category of data center service providers that call their facilities “edge data centers”. These facilities are built primarily to deliver content over the internet and operate on the concept of the “edge”, which is distinct from the traditional internet hubs in places like New York, Northern Virginia, Dallas, or Silicon Valley.

Thus, architecting something that can truly be called an “edge data center” requires a different set of considerations than building a standard colocation facility.

In these facilities, the emphasis is on creating interconnection ecosystems in cities away from the traditional core markets.

EdgeConneX, 365 Data Centers, and vXchnge are all examples of edge data center providers. For instance, EdgeConneX went from zero data centers two years ago to two dozen today and is still growing, and vXchnge bought eight SunGard data centers for the sole purpose of serving cached video streaming.

Moreover, some edge data center operators are exchanging traffic with content providers within their facilities to help them better manage their networks.

At the same time, demand dynamics are changing the design and expansion strategies of these edge data centers. One company that started small, at 1.2 megawatts of capacity, has scaled up to 10 megawatts in 2-megawatt increments in just two years. This shows how quickly the business of serving cached video content to users is growing.

In line with these growth prospects, a typical customer starts with about 30 kW to cache content that is in high demand in a specific market. As requirements grow over time and reach about 50 kW, the customer augments the caching infrastructure with a network node of its own. Once they have their own node, they start looking at 100 kW deals with the provider that can eventually grow into 200 kW deployments.

But expanding at this velocity requires significant investment, and that depends on how well the companies are doing in the video streaming market.

Remember—“All that glitters is not gold!”

 

Having a backup copy in the cloud will always ensure data continuity!

Backups are essential for disaster recovery and for keeping enterprise business continuity plans intact and in place. Having backup software or an appliance in your storage environment will always work in your favor when a storage network goes down and you need to keep data flowing to your clients and customers.

But what happens if your data center is hit by a catastrophic disaster such as a fire, flood, earthquake, or tsunami?

No matter how well equipped you are to deal with these situations, sometimes you cannot escape the wrath of Mother Nature.

Here’s where StoneFly, Inc. offers a solution. If your business depends on data and needs it flowing at all times, consider StoneFly Cloud Backup Connect. It is an easy and affordable way to begin storing your valuable server and workstation backups in the cloud.

StoneFly Cloud Backup Connect lets organizations fully utilize the disaster recovery features of their existing backup software or backup appliance, enabling organizations large and small to implement offsite backup inexpensively.

Compatible with any enterprise backup software or backup appliance you may already be using (Symantec Backup Exec, Acronis, Commvault, Veeam, NovaStor, Unitrends, etc.), StoneFly provides a backup connector to your choice of private, hybrid, or public cloud. Implement a quick and simple disaster recovery plan while storing your backups off-site in Microsoft Azure, VMware vCloud, VMware vCloud Air, the StoneFly Cloud Business Center, or your own remote private or hybrid cloud.
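As a rough illustration of what “storing backups off-site in a public cloud” looks like in practice, here is a generic sketch that copies a local backup file to Azure Blob Storage using the azure-storage-blob Python SDK. This is not StoneFly’s connector or API; the connection string, container name, and file paths are hypothetical placeholders.

    # Generic sketch: copy a local backup file to Azure Blob Storage for offsite retention.
    # Illustrates the offsite-copy concept only; it is not StoneFly's connector.
    from azure.storage.blob import BlobServiceClient   # pip install azure-storage-blob

    AZURE_CONN_STR = "<storage-account-connection-string>"   # hypothetical placeholder
    CONTAINER = "offsite-backups"                            # hypothetical container name

    def upload_backup(local_path, blob_name):
        service = BlobServiceClient.from_connection_string(AZURE_CONN_STR)
        blob = service.get_blob_client(container=CONTAINER, blob=blob_name)
        with open(local_path, "rb") as f:
            blob.upload_blob(f, overwrite=True)              # ship the backup image offsite

    upload_backup("backups/fileserver-full.bak", "fileserver/fileserver-full.bak")

The appeal of a connector-style product is that your existing backup software keeps writing to what looks like a local target, and the off-site copy happens behind the scenes rather than through scripts like the one above.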

To learn more, call 510.265.1616 or click on StoneFly Cloud Backup Connect.

Toshiba to offer 128TB SSD!

Toshiba is all set to offer 128TB SSDs by 2018. The company believes that its new type of NAND flash memory will not only help reduce the cost of non-volatile memory, but will also enable manufacturers to build solid-state drives with unprecedented capacities.

Toshiba plans to introduce quad-level cell (QLC) BiCS (bit cost scalable) 3D NAND flash by 2017, which should greatly reduce the cost of non-volatile memory.

Based on this new memory, Toshiba expects to offer 128TB high-capacity solid-state drives by early 2018. Such drives will be based on QLC BiCS NAND flash memory.
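For context, the density gain from QLC comes from storing more bits per flash cell. A quick back-of-the-envelope comparison is sketched below; the cell count is purely illustrative, not an actual Toshiba die or drive specification.

    # Back-of-the-envelope: SSD capacity scales with bits stored per flash cell.
    # The cell count below is illustrative only, not a Toshiba spec.
    cells = 256 * 10**12        # hypothetical number of flash cells in a drive

    bits_per_cell = {"SLC": 1, "MLC": 2, "TLC": 3, "QLC": 4}
    for name, bits in bits_per_cell.items():
        capacity_tb = cells * bits / 8 / 10**12     # bits -> bytes -> terabytes (decimal)
        print(f"{name}: {capacity_tb:.0f} TB")

    # QLC holds 4 bits per cell versus TLC's 3 -- roughly a third more capacity
    # from the same number of cells, before accounting for 3D (BiCS) stacking.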

However, for data centers to take advantage of such high-capacity drives, they will need to upgrade their software.

Toshiba is confident that the capacities of solid-state drives will considerably exceed those of leading-edge hard disk drives. Toshiba has not revealed how much a 128TB SSD might cost three years from now, but the price of such a drive is likely to be considerably higher than the price of multiple hard disk drives adding up to the same capacity.

Hope the hard drive makers are listening!

 

DNF Security offers NVR with cloud storage integration!

DNF Security, a business unit of Dynamic Network Factory, is offering a video surveillance viewing and storage platform with cloud storage integration. This combo NVR series of video storage and video viewing platforms is the foundation of reliable, easy-to-deploy IP surveillance solutions.

DNF Security Falcon series appliances go beyond conventional analog CCTV and traditional DVR systems to deliver unmatched performance, scalability, and compatibility.

The Falcon series appliances from DNF Security come with high-performance features including 64-bit dual- or quad-core Intel 4th-generation Haswell processors, up to 32GB of high-speed video cache (8GB standard), and dual copper gigabit Ethernet connectivity.

These fault-tolerant data storage appliances come in a convenient, space-saving tower or 4U rackmount chassis with hot-swappable drives and trays. User data is protected with your choice of RAID 0, 1, 5, 6, 10, 50, or 60 across up to eight hot-swappable disk drives on a high-performance RAID controller.
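To give a feel for the trade-off between those RAID levels, here is a rough usable-capacity calculation for a hypothetical eight-drive configuration. The 4TB drive size is chosen purely for illustration; actual sizing depends on the appliance ordered.

    # Rough usable-capacity estimate for an 8-drive array (illustrative drive size).
    drives, size_tb = 8, 4      # hypothetical: eight 4TB drives

    usable = {
        "RAID 0":  drives * size_tb,         # striping, no redundancy
        "RAID 1":  drives * size_tb / 2,     # mirrored pairs
        "RAID 5":  (drives - 1) * size_tb,   # one drive's worth of parity
        "RAID 6":  (drives - 2) * size_tb,   # two drives' worth of parity
        "RAID 10": drives * size_tb / 2,     # striped mirrors
        "RAID 50": (drives - 2) * size_tb,   # two 4-drive RAID 5 groups, striped
        "RAID 60": (drives - 4) * size_tb,   # two 4-drive RAID 6 groups, striped
    }
    for level, tb in usable.items():
        print(f"{level}: ~{tb:.0f} TB usable of {drives * size_tb} TB raw")

In surveillance deployments the choice usually comes down to how much recorded footage you can afford to lose versus how many drive failures the array must survive while recording continues.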

The highlight is that all Falcon Series NVR and video storage appliances support an optional cloud connection for video archiving and remote access. Live recordings continue to be stored directly on the Falcon combo NVR series of video storage and video viewing platforms, while archived footage can be written to and read from a secure StoneFly Cloud Drive hosted in Microsoft Azure or in the StoneFly Cloud Business Center, a private cloud.

Combining a StoneFly Cloud Drive subscription with a Falcon video storage appliance immediately saves on capital and operating expenses, enabling organizations large and small to implement offsite video archival and backup inexpensively.

Moreover, the Falcon Combo NVR’s cloud integration is highly customizable, so users can adapt it to cloud-based surveillance needs such as remote monitoring.

To learn more, call 510.265.1122 or click on DNF Security Falcon Combo NVR Series appliances.

Can 4K video surveillance boost safety and security in hospitals?

Hospitals all over the world face a number of security challenges, from ensuring the safety of patients and staff to preventing theft, with the added complication of very limited security budgets.

This is where video surveillance solutions become critical for hospital security, particularly in high-risk areas such as maternity wards, pharmacies, and parking lots.

4K, or Ultra High Definition, surveillance systems can help hospitals monitor essential areas effectively while improving cost effectiveness, reducing management and maintenance fees, and improving overall safety and security.

One of the most significant security challenges for hospitals in developing regions such as Africa and India is their layout. The facilities are usually large and sprawling, with many different floors and areas linked by corridors. Large hallways and reception areas that are constantly teeming with patients and visitors are also a concern from a security point of view.

Since hospitals operate 24 hours a day, seven days a week, 365 days a year, there is no downtime, and security solutions must be constantly operational. People are constantly moving in and out of the various areas, from doctors and patients to visitors, deliveries, maintenance and cleaning staff, and more, creating a complex and constantly busy environment.

As cases of baby snatching and drug theft from storerooms increase, security has to be tightened on all fronts to keep these activities in check.

While the need for security, particularly surveillance, is clear, hospitals often have very limited budgets for it. Centralized onsite monitoring is essential, as a rapid response to any situation is critical. In addition, surveillance needs to be scalable, able to integrate with existing security technologies, and should also allow for integration with solutions such as access control to deliver enhanced security.

With cameras covering high-risk areas to identify perpetrators, preventing theft becomes a much simpler task. Within maternity wards, surveillance cameras can be linked to access control to verify that only authorized persons are allowed into the area where babies are kept.

Both of these areas can benefit from 4K cameras, which provide significantly increased video resolution. This lets people reviewing the footage zoom in with excellent quality and clarity and identify individuals more accurately.

High-resolution 4K cameras are also highly useful in parking lots and reception areas, as a single camera can cover a wide area with a great level of detail. One 4K camera can cover roughly the same area as five HD surveillance cameras, dramatically improving cost effectiveness by reducing maintenance and total cost of ownership (TCO).
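The “one 4K camera replaces several HD cameras” claim follows from simple pixel arithmetic, shown below; the exact replacement ratio achievable in practice depends on lens choice and the pixel density required on target.

    # Pixel-count comparison: 4K UHD vs. 1080p HD.
    uhd_4k   = 3840 * 2160      # ~8.3 megapixels
    hd_1080p = 1920 * 1080      # ~2.1 megapixels

    print(f"4K has {uhd_4k / hd_1080p:.0f}x the pixels of 1080p")   # prints 4x

    # At the same pixel density on target (pixels per metre of scene), one 4K camera
    # can cover roughly four times the area of a 1080p camera; wider lenses and lower
    # density requirements for overview shots can push that toward five or more.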

Moreover, a single 4K camera with built-in intelligence can scale down its footage when only low bandwidth is available, further reducing the cost of running the solution.

In conclusion, hospitals are an essential service to all communities, but they are also exposed to a wide variety of security threats, ranging from theft of drugs and equipment to kidnapping and incidents of doctors and staff being attacked.

Ultra high-definition 4K surveillance cameras are a cost-effective way to help hospitals bolster security without breaking the budget, ensuring that patients, staff, and visitors are as safe as possible.

Virtual servers double the cost of a security breach, says survey!

Kaspersky Lab recently released a report which says that a security incident involving virtual servers, in either a public or private cloud, roughly doubles the recovery cost compared to an incident in a traditional environment.

The underlying survey, conducted by B2B International, found that enterprises paid an average of $800,000 to recover from a security breach involving virtual servers, compared to $400,000 in traditional environments.

The same survey found that SMBs saw costs rise from an average of $26,000 to $60,000 when virtualization was involved.

According to Kaspersky Lab, there are three main reasons for this cost difference:

  1. IT professionals operate under the myth that virtual servers are inherently more secure than their traditional counterparts and therefore need no extra security precautions.
  2. They believe that if a virtual machine catches a virus, they can simply delete it and create a new one from a template.
  3. Around 62% of respondents believed that risks in virtual environments were significantly lower than in physical environments.

The study also revealed that malware can hop from one VM to another, embed itself in the hypervisor, and use other techniques to avoid being cleaned out by re-imaging.

The major virtualization risk identified in the study is the window of vulnerability between the time a virtual machine is spun up and the time its anti-virus software is updated. This window can be dramatically magnified if all virtual machines need to be updated at once.
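One common way to shrink that window is to hold a freshly spun-up VM out of the production network until its anti-virus definitions are confirmed current. A minimal sketch of that gating logic follows; the attach hook and the source of the last-update timestamp are hypothetical placeholders for whatever your hypervisor and AV management tooling expose.

    # Minimal sketch: don't attach a new VM to the production network until its
    # anti-virus definitions are recent. The attach hook is a hypothetical
    # placeholder for your own hypervisor / AV management tooling.
    from datetime import datetime, timedelta

    MAX_DEFINITION_AGE = timedelta(hours=24)

    def definitions_are_fresh(last_update):
        # True if the VM's AV definitions were refreshed within the allowed window.
        return datetime.utcnow() - last_update <= MAX_DEFINITION_AGE

    def admit_vm(vm_id, last_av_update, attach_to_network):
        if definitions_are_fresh(last_av_update):
            attach_to_network(vm_id)     # hypothetical hook into the hypervisor
            return True
        return False                     # keep the VM quarantined until AV updates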

Because of risks like these, virtualized environments can require security solutions specifically designed for virtual servers.

Yet the survey revealed that only 13% of respondents had deployed a security solution specifically designed for virtual environments.

The B2B International survey also found that companies were well ahead with disaster recovery plans for their traditional infrastructure, but were poorly prepared for their virtual environments.

The survey further noted that virtualization can be expensive, complicated, and time-consuming to deploy, and is often used for the most mission-critical, high-value processes. So when the infrastructure goes down, so do the processes.

One of the highlights of the B2B survey is that 66 percent of respondents said they lost access to business-critical information during an incident involving virtualization, compared to 36 percent in a traditional environment. This is because companies are not as prepared to recover from an incident that involves virtualization. These incidents also result in a doubling of costs related to lost business, damage to company reputation, damage to credit ratings, and increased insurance premiums.

To address these problems, companies need to recognize that virtualization can require different security solutions than traditional environments, and they should be thinking about security and disaster recovery from the very start of the virtualization process.

And if you are unsure how to efficiently transform your IT storage configuration through virtualization without running into these hazards, approach Dynamic Network Factory Corporation (DNF Corp).

Dynamic Network Factory can help you gain the benefits of virtualization to the fullest and will assist your organization in gaining the much-needed edge in today’s competitive marketplace.

The Windows 10 free upgrade comes with these problems!

Microsoft released its new Windows 10 operating system to the world on July 29, 2015, and in less than four weeks the new OS has been hit with concerns over upgrade bugs, privacy, massive data consumption, and future updates.

If the issues above aren’t enough, there’s one more reason to worry about your free Windows 10 install, especially if you are a PC user who likes to upgrade your hardware regularly.

Remember, whenever you change the hardware in your PC, especially major components such as the motherboard, there’s a chance that Windows 10 will not recognize the machine as one that’s allowed to run an authorized copy of Windows 10. Instead, your PC risks being labeled a non-genuine Windows 10 install, and there’s no automated process to make that kind of error vanish.

This is because Windows 10 doesn’t come with a serial key that you can use and reuse every time you want to install a fresh copy on the same computer. While the system will automatically detect that you’re allowed to use Windows 10 on a given machine each time you perform a clean install, it might not do the same once you change individual components, such as the processor or motherboard. Microsoft’s free Windows 10 upgrade servers register your PC’s hardware profile, so once the specs differ, you’re in trouble.

However, Microsoft has a fix for this: contact its customer support and explain the hardware upgrade, after which your Windows 10 license will be activated remotely, without needing a new license key.

There is another issue that Windows 10 users who came from Windows 7 or Windows 8 should be aware of. Even if they purchased a full retail license of either Windows 7 or Windows 8 and then moved to Windows 10, that doesn’t give them the right to install Windows 10 on a brand-new machine.

Full retail licenses let anyone take their Windows license from one PC to the next, as long as they remove it from the previous machine. But upgrading from a full retail license of Windows 7 or Windows 8 doesn’t get you a free, portable, full retail Windows 10 license as you might expect.

Also, for those who haven’t opted into the free Windows 10 upgrade, the other issue is that Microsoft is pushing its free OS upgrade to everyone who has left their PC’s software update settings on automatic.

Some users have reported that Microsoft’s forced Windows 10 upgrade downloads consume around 750MB of data, and some have even reported the free upgrade consuming around 3GB of their bandwidth, in some cases for an upgrade that ultimately failed.

Those with a fair usage policy (FUP) limit on their data plans should check this before it is too late. Keep a watch on your Wi-Fi settings and your download bandwidth and see for yourself.

The other major concern is an embarrassment from a child’s point of view. By default, Windows 10 sends parents a weekly “activity update” on their children’s internet browsing and computer use. This feature can, however, be a boon to parents, as it lets them keep an eye on their children’s activity, including the list of websites their children have visited, the number of hours spent per day on the PC, and how long they have used their favorite apps.

After reviewing all of these concerns, it may be better to buy a Windows 10 license instead of relying on the free upgrade.

FYI, Microsoft recently announced that its new OS is now installed on 75 million devices, and it is aiming for the 1 billion device mark over the next few years.
