
Data Transfer

DEFINITION: A bottleneck, in a communications context, is a point in the enterprise where the flow of data is impaired or stopped entirely. In effect, there is not enough data-handling capacity for the current volume of traffic. A bottleneck can occur in the user network, in the storage fabric, or within servers where there is excessive contention for internal server resources, such as CPU processing power, memory, …
Definition continues below.
Data Transfer White Papers
 
Hybrid Storage Arrays: Solution Overview
sponsored by Microsoft
WHITE PAPER: This white paper introduces a hybrid storage array that automates time-consuming data protection and storage capacity scaling, so you can spend less time tending to your storage infrastructure. Read on to learn the benefits this array can provide for your storage.
Posted: 14 Dec 2015 | Published: 31 Jul 2014

Microsoft

Top 10 Reasons to Store More Data in Memory
sponsored by Software AG
WHITE PAPER: As data pools continue multiplying, more and more organizations are moving their growing volumes of data out of disk-based storage systems and remote relational databases and into machine memory. But there’s more to be said about in-memory data storage. Check out this resource to discover the top ten reasons to store more data in memory.
Posted: 21 Jun 2012 | Published: 13 Jun 2012

Software AG

Taming the Capacity Monster
sponsored by Microsoft
WHITE PAPER: Storage needs to be flexible in order to accommodate potentially unexpected data growth without crashing. This white paper discusses ways to attain that flexibility. Read on to learn how to meet the challenge of unstoppable data growth.
Posted: 05 Jan 2016 | Published: 05 Jan 2016

Microsoft

The Economic Evolution of Enterprise Storage
sponsored by Hitachi Data Systems
WHITE PAPER: The latest technology innovations provide a big evolutionary stride in economically superior storage, designed to swiftly address mounting data challenges and changing business demands. With 3D scaling and dynamic tiering, IT leaders can dynamically scale up, scale out, and scale deep for increased performance and scalability.
Posted: 02 Dec 2010 | Published: 02 Dec 2010

Hitachi Data Systems

Migrating Users from Physical Workstations to XenDesktop
sponsored by Citrix
WHITE PAPER: This document provides recommendations and best practices for moving a user’s data from their physical corporate workstations into the virtual Citrix XenDesktop environment.
Posted: 05 Oct 2009 | Published: 05 Oct 2009

Citrix

Guide: Secure Copy
sponsored by ScriptLogic Corporation
WHITE PAPER: Moving data to a new server involves more than copying the most recent files and folders to that machine. File security, permissions, shared folders, and the local groups that facilitate access must also be maintained.
Posted: 27 Aug 2009 | Published: 27 Aug 2009

ScriptLogic Corporation
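The point made in the Secure Copy abstract above, that a plain file copy loses security metadata, can be illustrated with a short, hypothetical Python sketch. This covers POSIX mode bits and timestamps only; Windows ACLs, shares, and local groups, which the paper addresses, require platform-specific tooling:

```python
import os
import shutil
import stat
import tempfile

def copy_with_metadata(src: str, dst: str) -> None:
    """Copy a file plus its POSIX metadata (permission bits, timestamps).

    shutil.copy2 preserves mode bits and timestamps, but NOT Windows ACLs,
    NTFS alternate streams, or share definitions -- exactly the gap the
    white paper highlights for server migrations.
    """
    shutil.copy2(src, dst)

# Demo: a restrictive mode survives the copy, whereas a naive
# open/read/write copy would reset it to the process umask.
work_dir = tempfile.mkdtemp()
src = os.path.join(work_dir, "payroll.csv")
with open(src, "w") as f:
    f.write("sensitive data\n")
os.chmod(src, 0o640)          # owner rw, group r, others: no access

dst = os.path.join(work_dir, "payroll_copy.csv")
copy_with_metadata(src, dst)
print(oct(stat.S_IMODE(os.stat(dst).st_mode)))
```

The file name and helper are illustrative, not taken from the ScriptLogic paper; the point is only that metadata preservation must be deliberate, not assumed.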

IPAM Intelligence: All Roads Lead to Proteus
sponsored by BlueCat
WHITE PAPER: As IPAM evolves beyond a simple marriage of DNS and DHCP services, its definition can no longer be limited to the benefits of dynamically linking DNS and DHCP functionality. IPAM transcends this marriage to include features and functions shaped by new requirements in an age of dynamic IP address data.
Posted: 25 Oct 2010 | Published: 25 Oct 2010

BlueCat

Using Data Replication to Upgrade Your Oracle Database with Minimal Downtime and Risk
sponsored by Dell Software
WHITE PAPER: Migrating away from older versions of Oracle that require expensive extended contracts is a choice many companies are making to cut costs over the long term. Find out the best practices that can help ensure your next Oracle system upgrade is an economical success.
Posted: 13 Feb 2014 | Published: 31 Dec 2013

Dell Software

NFS Evolution Changes the Landscape of HPC Data Management
sponsored by BlueArc Corp.
WHITE PAPER: In HPC data management, traditional standards-based solutions have been limited in performance and scalability, but proprietary, high-performance solutions have required specific expertise to set up, manage, or scale. Read this Tabor Research White Paper to learn about file systems used in HPC environments as well as BlueArc storage solutions.
Posted: 02 Jul 2009 | Published: 02 Jul 2009

BlueArc Corp.

Dispelling the Myths of IBM i Data Access
sponsored by SEQUEL-Software
WHITE PAPER: Accessing data in a format that is usable for each type of user is often referred to as Business Intelligence, or BI. When done correctly, BI can be a powerful force in an organization. Unfortunately, that’s not often the case. Part of the problem is the many myths surrounding BI and its basic concepts.
Posted: 14 Jul 2010 | Published: 14 Jul 2010

SEQUEL-Software
 
 
DATA TRANSFER DEFINITION (continued): … or I/O (input/output). As a result, data flow slows to the speed of the slowest point in the data path. This slowdown degrades application performance, especially for databases and other transaction-heavy applications, and can even cause some applications to crash. A bottleneck frequently arises from poor network or storage fabric design; mismatched hardware selection is a common cause. For example, if a workgroup server is fitted with a Gigabit Ethernet port but the corresponding switch port that connects to the server offers only a legacy 10/100 Ethernet port, the slow switch port will …
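The definition's core point reduces to simple arithmetic: end-to-end throughput is capped by the slowest element in the data path. A minimal Python sketch, with illustrative link speeds based on the Gigabit-NIC-behind-a-10/100-switch-port example above:

```python
def path_throughput(link_speeds_mbps):
    """End-to-end throughput is capped by the slowest link in the path."""
    return min(link_speeds_mbps)

# The definition's example: a server with a Gigabit Ethernet NIC
# cabled to a legacy 10/100 switch port.
path = [1000, 100, 1000]      # server NIC, switch port, uplink (Mbps)
print(path_throughput(path))  # prints 100: the switch port is the bottleneck
```

The `path_throughput` helper is a hypothetical illustration, not something from the definition; real throughput is further reduced by protocol overhead and contention.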
Data Transfer definition sponsored by SearchEnterpriseWAN.com, powered by WhatIs.com, an online computer dictionary.

About TechTarget:

TechTarget provides enterprise IT professionals with the information they need to perform their jobs - from developing strategy to making cost-effective IT purchase decisions and managing their organizations' IT projects - with its network of technology-specific websites, events and magazines.

All Rights Reserved, Copyright 2000 - 2016, TechTarget | Read our Privacy Statement