Condusiv Technologies Blog

Blogging @Condusiv

The Condusiv blog shares insight into the issues surrounding system and application performance—and how I/O optimization software is breaking new ground in solving those issues.

Finance Company Deploys V-locity I/O Reduction Software for Blazing Fast VDI

by Dawn Richcreek 27. November 2018 05:27

When the New Mexico Mortgage Finance Authority decided to better support their users by moving away from using physical PCs and migrating to a virtual desktop infrastructure, the challenge was to ensure the fastest possible user experience from their Horizon View VDI implementation.

“Anytime an organization starts talking about VDI, the immediate concern in the IT shop is how well we will be able to support it from a performance standpoint to ensure a pristine end user experience. Although supported by EMC VNXe flash storage with high IOPS, one of our primary concerns had to do with Windows write inefficiencies that chew up a large percentage of flash IOPS unnecessarily. When you’re rolling out a VDI initiative, the one thing you can’t afford to waste is IOPS,” said Joseph Navarrete, CIO, MFA.

After Joseph turned to Condusiv’s “Set-It-and-Forget-It®” V-locity® I/O reduction software and bumped up the memory allocation for his VDI instances, V-locity was able to offload 40% of I/O from storage, resulting in a much faster VDI experience for his users. When he demoed V-locity on his MS-SQL server instances, V-locity eliminated 39% of his read I/O traffic from storage via DRAM read caching and another 40% of write I/O operations by solving Windows write inefficiencies at the source.

After seeing the performance boost and increased efficiency across his hardware stack, Joseph made sure V-locity was running on the rest of his systems, including MS-Exchange, SharePoint, and more.

“With V-locity I/O reduction software running on our VDI instances, users no longer have to wait extra time. The same is now true for our other mission-critical applications like MS-SQL. The dashboard within the V-locity UI provides all the necessary analytics about our environment and a view into what the software is actually doing for us. The fact that all of this runs quietly in the background with near-zero overhead and no longer requires a reboot to install or upgrade makes the software truly ‘set and forget,’” said Navarrete.

 

Read the full case study | Download 30-day trial

Fix SQL Server Storage Bottlenecks

by Spencer Allingham 23. October 2018 20:58

No SQL code changes.
No Disruption.
No Reboots.
Simple!

 

Condusiv V-locity Introduction

 

 

Whether they run SQL in a physical or virtualized environment, most SQL DBAs would welcome faster storage at a reasonable price.

The V-locity® software from Condusiv® Technologies is designed to provide exactly that, using the storage hardware you already own. It doesn't matter whether you have direct-attached disks, a tiered SAN, a tray of SSD storage, or are fortunate enough to have an all-flash array; that storage layer can still be a limiting factor in your SQL Server database productivity.

The V-locity software reduces the amount of storage I/O traffic that has to go out and be processed by the disk storage layer, and streamlines the data that still has to go out to disk.

The net result is that SQL Server can typically complete more transactions in the same amount of time, quite simply because, on average, it spends less time waiting on storage before it can get on with its next transaction.

V-locity can be downloaded and installed without any disruption to live SQL servers. No SQL code changes are required and no reboots. Just install and typically you'll start seeing results in just a few minutes.
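
If you want a quick, scriptable sanity check of your own, one rough approach (a sketch on my part, not a Condusiv tool) is to snapshot the operating system's disk counters over a representative window before and after installing, and compare how much I/O actually reaches the disk layer. The example below assumes Python with the third-party psutil package; the 10-minute window is purely illustrative.

# Rough sketch: sample the OS disk I/O counters before and after a period of
# normal workload, so you can compare how much I/O actually reaches the disk
# layer. psutil is a third-party package (pip install psutil); the interval
# and output format here are illustrative assumptions, not part of V-locity.
import time
import psutil

def snapshot():
    io = psutil.disk_io_counters()
    return io.read_count, io.write_count, io.read_bytes, io.write_bytes

INTERVAL_SECONDS = 600  # sample over 10 minutes of representative workload

r0, w0, rb0, wb0 = snapshot()
time.sleep(INTERVAL_SECONDS)
r1, w1, rb1, wb1 = snapshot()

print(f"Reads reaching disk:  {r1 - r0:,} ({(rb1 - rb0) / 2**20:.1f} MiB)")
print(f"Writes reaching disk: {w1 - w0:,} ({(wb1 - wb0) / 2**20:.1f} MiB)")

Run the measurement at a comparable time of day in both cases so the workloads are broadly similar.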

Before we take a more in-depth look at that, I would like to briefly mention that last year the V-locity software was awarded the Microsoft SQL Server I/O Reliability Certification. This means that, whilst providing faster storage access, V-locity didn't adversely affect the required and recommended behaviors that an I/O subsystem must provide for SQL Server, as defined by Microsoft themselves.

Microsoft ran tests for this in Azure, with SQL Server 2016, and used HammerDB to generate an online transaction processing type workload. Not only was V-locity able to jump through all the hoops necessary to achieve the certification, it also showed about 30% more SQL transactions completed in the same amount of time.

In this test, that meant roughly 30% more orders processed.

They probably could have processed more too, if they had allowed V-locity a slightly larger RAM cache size.
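
HammerDB itself is the obvious choice for a repeatable before-and-after comparison, but if you just want a crude feel for transaction throughput, a short timing loop does the job. The sketch below is only an illustration of the idea: it assumes Python with the pyodbc package, a locally reachable SQL Server instance with integrated authentication, and a made-up dbo.Orders table, so adjust the connection string and statement to your own disposable test schema, and never point something like this at a production server.

# Crude throughput check: run a fixed batch of small insert transactions and
# report transactions per second. The server name, database and dbo.Orders
# table are hypothetical placeholders -- substitute a disposable test schema.
import time
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=TestDB;Trusted_Connection=yes;"
)
TRANSACTIONS = 1000

conn = pyodbc.connect(CONN_STR, autocommit=False)
cursor = conn.cursor()

start = time.perf_counter()
for i in range(TRANSACTIONS):
    cursor.execute(
        "INSERT INTO dbo.Orders (CustomerId, Amount) VALUES (?, ?)", i, 9.99
    )
    conn.commit()  # one commit per transaction, OLTP-style
elapsed = time.perf_counter() - start

print(f"{TRANSACTIONS / elapsed:.0f} transactions/second over {elapsed:.1f}s")
conn.close()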

To get more information, including best practices for running V-locity on MS SQL servers, easy ways to validate results, customer case studies and more, click here for the full article on LinkedIn.

If you simply want to try V-locity, click here for a free trial.

Use the V-locity software not only to identify the servers causing storage I/O issues, but to fix those issues at the same time.

Financial Sector Battered by Rising Compliance Costs

by Dawn Richcreek 15. August 2018 08:39

Finance is already an outlier in terms of IT costs. The industry devotes 10.5% of total revenue to IT—and on average, each financial industry IT staffer supports only 15.7 users, the fewest of any industry.

All over the world, financial services companies are facing skyrocketing compliance costs. Almost half the respondents to a recent Accenture survey of compliance officers in 13 countries said they expected 10% to 20% increases, and nearly one in five are expecting increases of more than 20%.

Much of this is driven by international banking regulations. At the beginning of this year, the Common Reporting Standard went into effect. An anti-tax-evasion measure signed by 142 countries, the CRS requires financial institutions to provide detailed account information to the home governments of virtually every sizeable depositor.

Just to keep things exciting, the U.S. government hasn’t signed on to CRS; instead it requires banks doing business with Americans to comply with the Foreign Account Tax Compliance Act of 2010, which requires—surprise, surprise—pretty much the same thing as CRS, but reported differently.

And these are just two examples of the compliance burden the financial sector must deal with. Efficiently, and within a budget. In a recent interview by ValueWalk entitled “Compliance Costs Soaring for Financial Institutions,” Condusiv® CEO Jim D’Arezzo said, “Financial firms must find a path to more sustainable compliance costs.”

Speaking to the site’s audience (ValueWalk is a site focused on hedge funds, large asset managers, and value investing), D’Arezzo noted that finance is already an outlier in terms of IT costs. The industry devotes 10.5% of total revenue to IT, more than government, healthcare, retail, or anybody else. It’s also an outlier in terms of IT staff load; on average, each financial industry IT staffer supports only 15.7 users, the fewest of any industry. (Government averages 37.8 users per IT staff employee.)

To ease these difficulties, D’Arezzo recommends that the financial industry consider advanced technologies that provide cost-effective ways to enhance overall system performance. “The only way financial services companies will be able to meet the compliance demands being placed on them, and at the same time meet their efficiency and profitability targets, will be to improve the efficiency of their existing capacity—especially as regards I/O reduction.”

At Condusiv, that’s our business. We’ve seen users of our I/O reduction software solutions increase the capability of their storage and servers, including SQL servers, by 30% to 50% or more. In some cases, we’ve seen results as high as 10X initial performance—without the need to purchase a single box of new hardware.

If you’re interested in working with a firm that can reduce your two biggest silent killers of SQL performance, request a demo with an I/O performance specialist now.

 

For an explanation of why your heaviest workloads are only processing half the throughput they should from VM to storage, view this short video.

 

Which Processes are Using All of My System Resources?

by Gary Quan 17. July 2018 05:50

Over time, as more files and applications are added to your system, you may notice that performance has degraded and want to find out what is causing it. A good starting point is to see how the system resources are being used and which processes and/or files are using them.

Both Diskeeper® and SSDkeeper® contain a lesser-known feature to assist you with this. It is called the System Monitoring Report, which can show you how the CPU and I/O resources are being utilized and then, digging down a bit deeper, which processes or files are using them.

Under Reports on the Main Menu, the System Monitoring Report provides you with data on the system’s CPU usage and I/O Activity.

 

The CPU Usage report takes the average CPU usage from the past 7 days, then provides a graph of the hourly usage on an average day. You can then see at which times the CPU resources are being hit the most and by how much.

Digging down some more, you can then see which processes utilized the most CPU resources.

 

The Disk I/O Activity report takes the average disk I/O activity from the past 7 days, then provides a graph of the hourly activity on an average day. You can then determine at which times the I/O activity is the highest.

Digging down some more, you can then see which processes utilized the I/O resources the most, plus what processes are causing the most split (extra) I/Os.
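
If you would also like a quick, scriptable cross-check of that per-process view outside of the product, the rough sketch below uses Python's third-party psutil package to rank processes by cumulative disk I/O. Note that it only shows raw read/write volume since each process started; the split (extra) I/O detail described above is something the System Monitoring Report adds on top.

# Rough cross-check of the per-process I/O view: list the processes that have
# issued the most disk I/O. Uses the third-party psutil package; the counters
# are cumulative since each process started, and split (extra) I/O counts are
# not exposed here -- that level of detail comes from the report itself.
import psutil

rows = []
for p in psutil.process_iter(["name", "io_counters"]):
    io = p.info["io_counters"]
    if io is not None:
        rows.append((p.pid, p.info["name"] or "?", io))

# Top 10 processes by total bytes read + written.
rows.sort(key=lambda r: r[2].read_bytes + r[2].write_bytes, reverse=True)
print(f"{'PID':>6}  {'Process':<30}{'Read MiB':>10}{'Write MiB':>10}")
for pid, name, io in rows[:10]:
    print(f"{pid:>6}  {name:<30}"
          f"{io.read_bytes / 2**20:>10.1f}{io.write_bytes / 2**20:>10.1f}")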

 

You can also see which file types have the highest I/O utilization as well as those causing the most split (extra) I/Os.  This can help indicate what files and related processes are causing this type of extra I/O activity.

 

So, if you are trying to see how your system is being used, perhaps to track down performance issues, this report gives you a quick and easy look at how the CPU and disk I/O resources are being used on your system and which processes and file types are using them. This, along with other Microsoft utilities like Task Manager and Performance Monitor, can help you tune your system for optimum performance.

Optimizing Cameras, GPS, Smart Phones and other SD Card devices

by Michael 9. January 2010 09:36

Rumack: Fragmentation slows down SD Card performance!

 

Striker: Surely you can't be serious.


 

Rumack: I am serious... and don't call me Shirley.    

 

The word is slowly [but surely :-)] getting out that fragmentation affects NAND Flash storage - specifically free space fragmentation. The SD Association website is a great resource for edification and development resources. On their page describing the SD Card Speed Classes, they even discuss fragmentation.

 

 

Fragmentation and Speed

The memory of a card is divided into minimum memory units. The host writes data onto memory units where no data is already stored. As available memory becomes divided into smaller units through normal use, this leads to an increase in non-linear, or fragmented storage. The amount of fragmentation can reduce write speeds so higher SD card speeds help compensate for fragmentation.
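
To make that mechanism concrete, here is a toy model (my own simplified illustration, not how any particular card's firmware actually behaves): when the free space is one large run, a file can be laid down in a single pass, but when it is scattered across many small free extents, the same file has to be split into many separate write operations, each with its own overhead.

# Toy model of free space fragmentation: count how many separate write
# operations a file of a given size needs when free space is contiguous versus
# scattered. The extent sizes are invented for illustration; real cards differ.
def writes_needed(file_units, free_extents):
    """Greedily fill free extents; each extent touched costs one write op."""
    ops = 0
    remaining = file_units
    for extent in free_extents:
        if remaining <= 0:
            break
        ops += 1
        remaining -= min(extent, remaining)
    if remaining > 0:
        raise ValueError("not enough free space")
    return ops

contiguous = [1000]                 # one large free extent
fragmented = [8, 4, 16, 2, 8] * 40  # many small scattered extents

print("Contiguous free space:", writes_needed(600, contiguous), "write operation(s)")
print("Fragmented free space:", writes_needed(600, fragmented), "write operations")

The numbers are invented; the point is simply that the fragmented case needs dozens of writes where the contiguous case needs one.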

 

There are several methods to address this issue. One is, as mentioned, to buy better-performing storage and hope your requirements never exceed the card's ability to deliver on your needs (though fragmentation will still limit the storage's peak performance). However, a better approach is to fix the root issue, and there are two ways to do that:

1. Defragment the free space (e.g. HyperFast, Diskeeper)

2. Copy all the data off the SD Card and reformat the card

The best approach is likely to be determined by the device in which the SD Card is used. If it's the card from your digital camera, option 2 is probably easy enough to undertake. If pulling all the data off the card is not feasible, optimize it (option 1).

The only other question then might be, "How often do I run optimization?" I did a blog post on that, including some performance tests, a few years ago; read it here or here.

Lastly, I thought I'd include a recent personal success from an IT professional who happens to also use Diskeeper at work:

"Happy new year!! I made an interesting test over the holidays using Diskeeper 2010. I was updating my GPS with new maps and discovered that my Garmin unit uses Fat32 file system so I figured I would run your software on it just for fun and see the results. I was able to defragment a large part of the files and it more than doubled the speed of the unit. The images render faster with less stutter, routes are recalculated almost instantly now and it finds points of interest much faster. I did the same test on 2 other GPS`s and got the same results." Regards, Carlo

Tags:

Defrag | SD Card | SSD, Solid State, Flash
