Data Efficiency

By Louis Imershein | Articles about deduplication, compression and thin provisioning for storage manufacturers

Using Data Reduction at the OS layer in Enterprise Linux Environments

CAMBRIDGE, Mass. – March 14, 2017 – Enterprises and cloud service providers that have built their infrastructure around Linux should deploy data …

Breaking up is hard to do: busting the handcuffs of traditional data storage

Premise: The largest and most successful Web companies in the world have proven a new model for managing and scaling a combined architecture of compute …

Cloud Computing

Addressing Bandwidth Challenges in the Hybrid Cloud

Any application infrastructure that relies on a single data center is only as safe as that data center’s physical resources and the competence of its …

Cloud Computing

Permabit VDO 6 Now Available for Storage OEMs and Software Defined Data Centers - Permabit

Permabit delivers on its promise to maximize density and lower TCO for on-premises and cloud-based deployments. CAMBRIDGE, Mass. – March 8, 2017 …

Cloud Computing

Reduce Cloud's Highly Redundant Data

Storage is the foundation of cloud services. All cloud services – delineated as scalable, elastic, on-demand, and self-service – begin with storage. …

Netflix mobile streaming should soon cost you much less data

Netflix has traditionally focussed on television screens, and for good reason. Around two-thirds of all Netflix hours are viewed on traditional TV …

Netflix

How Flash Storage is Powering the Public Sector’s Data Centers

This blog is an excerpt from GovLoop’s recent industry perspective, Powering the Public Sector’s Next-Generation Data Center with Flash Storage. …

Big Data

Transparent Data Reduction in the Private Cloud

Tom Cook, Permabit’s CEO, recently discussed how to control private cloud infrastructure and software costs by utilizing data reduction software. …

3PAR Dedupe + Compression Deep Dive

Getting Trim: HPE are a bit late on this release; it’s normally January that people want to start losing weight. Well, not 3PAR: it’s gut-busting, data …

Google's Brotli Data Compression Algorithm Can Make The Internet "A Lot Faster"

Short Bytes: Brotli is a new open source data compression library developed by Google. Its lossless compression algorithm manages to outperform the …

Data Deduplication - An Introduction To What Deduplication Is

Introduction: Long gone are the days when software installers came on 3.5-inch disks and CDs could be considered a corporate backup medium. Storage …
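The introduction above covers deduplication at a high level; a minimal sketch of the fixed-block, hash-indexed approach that most dedupe systems build on (the function names are illustrative, not any vendor's API):

```python
import hashlib

def dedupe_blocks(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks and store each unique block once.

    Returns (store, recipe): store maps a SHA-256 digest to the block's
    bytes; recipe is the ordered digest list needed to rebuild the data.
    """
    store = {}
    recipe = []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # duplicate blocks stored once
        recipe.append(digest)
    return store, recipe

def rebuild(store, recipe):
    """Reassemble the original data from the block store and recipe."""
    return b"".join(store[d] for d in recipe)
```

With highly redundant input (backups, VM images), the store holds far fewer blocks than the recipe references, which is where the capacity savings come from.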

Controlling Infrastructure and Software Costs in the Private Cloud

Organizations choose private cloud deployments over public when they want to combine the increased flexibility of the cloud model with the privacy …

Cloud Computing

Effective use of data reduction in the Public Cloud

Permabit CEO, Tom Cook recently wrote about how data reduction technology can simplify the problems associated with provisioning adequate storage …

Big Data

Data Efficiency in Public Clouds

Public cloud deployments deliver agility, flexibility and elasticity. This is why new workloads are increasingly deployed in public clouds. Worldwide …

What Product Breakthroughs Will Recent Advances in Deep Learning Enable?

What product breakthroughs will recent advances in deep learning enable? originally appeared on Quora: the knowledge sharing network where compelling questions are answered by people with unique insights. Answer by Eric Jang, Research engineer at Google Brain, on Quora: Deep Learning refers to a class …

Machine Learning

MariaDB embraces big data with general availability of ColumnStore 1.0

Pluggable Columnar Engine Option Delivers Single SQL Interface for Analytic and Transactional Workloads. MariaDB® Corporation, the company behind the …

Big Data

Light Loss

Shunji Funasaka, Koji Nakano, Yasuaki Ito (Hiroshima University). In: Algorithms and Architectures for Parallel Processing, pp. 281–294, …

Machine Learning

HPE Strengthens HCI Hand by Acquiring SimpliVity

Hewlett-Packard Enterprise, as part of an effort to become more competitive in the red hot hyperconverged infrastructure (HCI) market, announced this …

Cloud Computing

Parallel Multiway Methods for Compression of Massive Data and Other Applications

In this Invited Talk from SC16, Tamara Kolda from Sandia presents: Parallel Multiway Methods for Compression of Massive Data and Other …

Data Science

How the Flip Feng Shui technique undermines cloud security

Something everyone in security should learn is that, when the consequences and impacts are high enough, it's advantageous to retain scenarios that …

SEO

Understanding the importance of data deduplication for the modern data centre

By Staff Writer. Data deduplication might seem like a new thing, but the reality is that the technology has been around for quite some time. Warren …

Filling the Linux Data Reduction Gap – Permabit Briefing Note

Most data centers consider data reduction a “must have” feature for storage. The data reduction software should be able to deliver its capabilities …

Latest VDO Performance Results – High-end Performance Meets Software Defined Storage

Today Permabit announced that our Virtual Data Optimizer (VDO), using HIOPS Compression, achieved 8 GB/s throughput when tested on the Samsung NVMe …

Big Data

Permabit VDO Delivers Record-setting Performance on Samsung's NVMe Reference Design Platform

CAMBRIDGE, Mass., Dec. 21, 2016 /PRNewswire/ -- Permabit Technology Corporation, the data reduction experts, announced today that its Virtual Data …

Permabit Hits New Milestone in 2016 by Delivering the First Complete Data Reduction for Linux

BOSTON, Dec. 7, 2016 /PRNewswire/ -- Permabit Technology Corporation, the data reduction experts, brought complete storage efficiency to Linux in …

Discover new compression innovations Brotli and Zstandard

Brotli and Zstandard are two recent lossless compression algorithms. Discover more about them and how The Guardian is using them in production. In 1948, Claude Shannon published an extraordinary article, defining for the first time a mathematical model of information and determining the maximum …

Web Development
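Neither Brotli nor Zstandard ships in the Python standard library, so this sketch uses zlib's DEFLATE as a stand-in to illustrate the two properties the article discusses: a lossless round trip and a measurable compression ratio.

```python
import zlib

def deflate_ratio(payload: bytes, level: int = 9) -> float:
    """Compress with DEFLATE, verify losslessness, and return the
    compressed/original size ratio (smaller is better)."""
    compressed = zlib.compress(payload, level)
    assert zlib.decompress(compressed) == payload  # lossless round trip
    return len(compressed) / len(payload)
```

The same measurement applies unchanged to the third-party `brotli` and `zstandard` packages, which generally achieve better ratios (Brotli) or much faster decompression (Zstandard) than DEFLATE on web payloads.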

Optimizing run-length algorithm using octonary repetition tree. (arXiv:1611.09664v1 [cs.DS])

Compression is beneficial because it helps cut resource usage. It reduces data storage space as well as transmission traffic and improves web …

Algorithms
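The abstract above concerns run-length encoding; a minimal illustration of the basic scheme (the paper's octonary repetition tree optimization itself is not reproduced here):

```python
def rle_encode(s: str):
    """Encode a string as a list of (character, run_length) pairs."""
    runs = []
    for ch in s:
        if runs and runs[-1][0] == ch:
            runs[-1][1] += 1      # extend the current run
        else:
            runs.append([ch, 1])  # start a new run
    return [(c, n) for c, n in runs]

def rle_decode(runs):
    """Expand (character, run_length) pairs back into the original string."""
    return "".join(c * n for c, n in runs)
```

RLE only pays off when the input contains long runs of repeated symbols, which is why practical codecs layer further modeling (such as the repetition-tree structure the paper proposes) on top of it.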

What are some recent backup data deduplication advancements?

Copy data management is just one technology utilizing recent innovations in the backup deduplication space, combatting sprawl and managing snapshots.

Big Data

Permabit pulls on Red Hat, opens arms for a Linux cuddle

Crimson headcover kernel gets dedupe and compression. The Mad Hatter of Linux is getting Alice in Wonderland-style physical space virtualisation with …

Red Hat