Condusiv Technologies Blog

Blogging @Condusiv

The Condusiv blog shares insight into the issues surrounding system and application performance—and how I/O optimization software is breaking new ground in solving those issues.

Do you need to defragment a Mac?

by Michael 2. February 2011 05:54

The purpose of this blog post is to provide some data about fragmentation on the Mac that I've not seen researched or published elsewhere.

Mac OS X has a defragmenter built into the file system itself. Since the relevant file system source code is open, we looked at the code.

When a file is opened, it gets defragmented on the fly if all of the following conditions are met:

1. The file is less than 20MB in size

2. The file has more than 7 fragments (extents)

3. The system has been up for more than 3 minutes

4. The file is a regular file

5. The file system is journaled

6. The file system is not read-only
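Those conditions boil down to a simple boolean check. Here's a minimal sketch of the decision as described above (written in Python for readability; the names and structure are my own illustration, not the actual HFS+ kernel code):

```python
# Illustrative sketch of HFS+'s on-open defrag decision as summarized above.
# Function and parameter names are hypothetical, not Apple's identifiers.

TWENTY_MB = 20 * 1024 * 1024
MIN_UPTIME_SECONDS = 3 * 60  # system must be up for more than 3 minutes


def should_defrag_on_open(size_bytes, fragment_count, uptime_seconds,
                          is_regular_file, is_journaled, is_read_only):
    """Return True if the file would be defragmented when opened."""
    return (size_bytes < TWENTY_MB          # 1. less than 20MB
            and fragment_count > 7          # 2. more than 7 fragments
            and uptime_seconds > MIN_UPTIME_SECONDS  # 3. up > 3 minutes
            and is_regular_file             # 4. a regular file
            and is_journaled                # 5. journaled file system
            and not is_read_only)           # 6. not read-only


# Example: a 5MB journaled file in 12 pieces, 10 minutes after boot
print(should_defrag_on_open(5 * 1024 * 1024, 12, 600, True, True, False))  # True
```

Note how narrow the window is: a 25MB file, or a file in only 5 pieces, is never touched, which is why the mechanism is best thought of as a small on-the-fly cleanup rather than a full defragmenter.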

So what's Apple's take on the subject? An Apple technical article states this:

Do I need to optimize?

You probably won't need to optimize at all if you use Mac OS X. Here's why:

  • Hard disk capacity is generally much greater now than a few years ago. With more free space available, the file system doesn't need to fill up every "nook and cranny." Mac OS Extended formatting (HFS Plus) avoids reusing space from deleted files as much as possible, to avoid prematurely filling small areas of recently-freed space.
  • Mac OS X 10.2 and later includes delayed allocation for Mac OS X Extended-formatted volumes. This allows a number of small allocations to be combined into a single large allocation in one area of the disk.
  • Fragmentation was often caused by continually appending data to existing files, especially with resource forks. With faster hard drives and better caching, as well as the new application packaging format, many applications simply rewrite the entire file each time. Mac OS X 10.3 Panther can also automatically defragment such slow-growing files. This process is sometimes known as "Hot-File-Adaptive-Clustering."
  • Aggressive read-ahead and write-behind caching means that minor fragmentation has less effect on perceived system performance.

For these reasons, there is little benefit to defragmenting.

Note: Mac OS X systems use hundreds of thousands of small files, many of which are rarely accessed. Optimizing them can be a major effort for very little practical gain. There is also a chance that one of the files placed in the "hot band" for rapid reads during system startup might be moved during defragmentation, which would decrease performance.

If your disks are almost full, and you often modify or create large files (such as editing video, but see the Tip below if you use iMovie and Mac OS X 10.3), there's a chance the disks could be fragmented. In this case, you might benefit from defragmentation, which can be performed with some third-party disk utilities. 
 

Here is my take on that information:

While I have no problem with the lead-in, which says probably, the reasons given are theoretical. Expressing a theory and then an opinion on that theory is fine, so long as you properly indicate it is an opinion. The problem I do have is with the last sentence before the note, "For these reasons, there is little benefit to defragmenting." That is passing off theory as fact.

Theory, and therefore "reasons", needs to be substantiated by an actual scientific process: apply the theory, then validate or invalidate it. A common example of theory-as-fact is the statement "SSDs don't have moving parts and don't need to be defragmented". Given that our primary business is large enterprise corporations, we hear a lot of theory about the need (or lack thereof) to defragment complex and expensive storage systems. In all those cases, testing proves that fragmentation (of files, free space, or both) slows computers down. The reasons sound logical, which dupes readers and listeners into believing the statements are true.

On that note, while the first three reasons are logical, the last one is most likely wrong. Block-based read-ahead caching is predicated on files being sequentially located/interleaved on the same disk "tracks". File-based read-ahead would still have to issue additional I/Os due to fragmentation. Fragmentation of data essentially breaks read-ahead efforts. Could the Mac be predicting file access and pre-loading files into memory well in advance of use? Sure. If that's the case I could agree with the last point (i.e. "perceived system performance"), but I find it unlikely (anyone reading this is welcome to comment).
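To illustrate why fragmentation defeats read-ahead, here is a back-of-the-envelope model (my own simplification, assuming a 1MB maximum transfer size per I/O, which is an arbitrary figure): each fragment starts at a different disk location, so a file in N pieces needs at least N separate I/Os, no matter how smart the read-ahead is.

```python
def min_read_ios(file_size, fragment_sizes, max_io=1024 * 1024):
    """Lower bound on disk I/Os needed to read a file sequentially.

    Each fragment must start a new I/O (it lives at a different disk
    location), and no single I/O can exceed max_io bytes.
    """
    assert sum(fragment_sizes) == file_size
    ios = 0
    for frag in fragment_sizes:
        ios += -(-frag // max_io)  # ceiling division
    return ios


size = 8 * 1024 * 1024  # an 8MB file
print(min_read_ios(size, [size]))             # contiguous: 8 I/Os
print(min_read_ios(size, [64 * 1024] * 128))  # 128 fragments: 128 I/Os
```

Sixteen times the I/O count for the same 8MB of data, and on a rotating disk each of those extra I/Os also pays a seek penalty, which this simple count doesn't even include.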

They do also qualify the reason by stating "minor fragmentation", to which I would add that minor fragmentation on Windows may not have a "perceived" impact either.

I do agree with the final statement that "you might benefit from defragmentation" when using large files, although I think might is too indecisive.

Where my opinion comes from:

A few years ago (spring/summer of 2009) we did a research project to understand how much fragmentation existed on Apple Macs. We wrote and sent out a fragmentation/performance analysis tool to select customers who also had Macs at their homes/businesses. We collected data from 198 volumes on 82 Macs (OS X 10.4.x and 10.5.x); 30 of those systems had been in use for between one and two years.

While system specifics are confidential (testers provided us the data under non-disclosure agreements), we found that free space fragmentation was particularly bad in many cases (worse than Windows). We also found an average of a little over 26,000 fragments per Mac, with an average expected performance gain from defrag of about 8%. Our research also found that the more severe cases of fragmentation, where we saw 70,000/100,000+ fragments, were on systems low on available free space (substantiating that last paragraph in the Apple tech article).

This article also provides some fragmentation studies as well as performance tests. Its data also validates Apple's last paragraph and makes the "might benefit" statement a bit understated.

Your Mileage May Vary (YMMV): 

So, in summary, I would recommend defragmenting your Mac. As with Windows, the benefit from defragmenting is proportionate to the amount of fragmentation. Defrag will help. The question is "does defrag help enough to be worth the time and money?". The good thing is that most Mac defragmenters, just like Diskeeper for Windows, have free demo versions you can trial to see if it's worth spending the money.

 

Here are some options: 

+ iDefrag (used by a former Diskeeper Corp employee who did graphic design on a Mac)

+ Drive Genius suite (a company we have spoken with in the past)

+ Stellar Drive Defrag (relatively new)

Perhaps this article begs the question/rumor "will there be a Diskeeper for Mac?", to which I would answer "unlikely, but not impossible". The reason is that we already have a very full development schedule with opportunities in other areas that we plan to pursue.

We are keeping an "i" on it though ;-).

An Open Letter to Global 1000 CIOs

by Lisa Terrenzi - Chief Executive Officer 28. January 2011 11:41

Dear CIO,

One of the most positive things to come out of the recession is a deeper appreciation of how much an efficient IT operation impacts corporate competitiveness and profitability. And no wonder: envisioning and acting upon business opportunities requires having a powerful and flexible data center to work with.

In the real world, most networks are patchworks of legacy and cutting-edge technology. Despite the best efforts of IT managers, IT investments take longer to recover and the cost of operations creeps steadily upward.

Imagine a product that automatically ensured Windows systems ran at the peak speeds they were designed to deliver, reliably, using less energy and for a longer lifespan. Imagine the effect on operating costs and value return.

Hundreds of your peers in Global 1000 companies and Federal and State agencies don’t have to imagine this because this product exists and they refuse to run their networks without it: Diskeeper performance technology for Windows systems.

I am the CEO of the company that makes Diskeeper and I run the team behind the product. We’ve been around for 30 years; we’re in for the long term and we’d like an opportunity to help you achieve your IT goals faster.  

A simple evaluation on your IT network will show you the Diskeeper business value far better than I can say. Here is a link to quickly download a trial evaluation of Diskeeper. I suggest you have your IT manager test it out and report back to you. https://www.diskeeper.com/landing/diskeeper-30-day-trialware.aspx

Best,

Lisa Terrenzi

CEO

Diskeeper Corporation   

Defrag | Diskeeper

Faster Backups/Archiving/Dedupe/DR success with Diskeeper and V-locity

by Colleen Toumayan 27. January 2011 03:29

"Spokane Regional Health District uses CommVault Simpana backup/archiving/disaster recovery software installed on a dedicated server with 37TB of SAS attached storage.

                                                                                       

We perform daily full and incremental backups of all our servers. The data backup is disk-to-disk-to-tape and is deduplicated as it is saved on the SAS storage. The deduplication process can create a very large number of file fragments, sometimes over 1,540,000 fragments on a 2TB disk array. With Diskeeper EnterpriseServer automatic defrag running, the response time of the arrays is approaching a 0.02 second delay due to fragmentation. This has reduced our backup time by approximately 25 percent for any D2D2T job.

SRHD also uses Microsoft Hyper-V and currently has 31 virtualized servers running on an Intel Modular Server. There are 72TB of storage available to the Modular Server via SAS connections featuring dual path IO. All of the data on the SAS arrays is maintained in RAID 60 logical disk drives. Since setting up V-locity, which has built-in support for VHD (virtual hard disks), with automatic defragmentation, our VHDs very seldom show any fragmentation. 

                                                         

The solutions also have the intelligence to monitor disk IO, and defragmentation will pause to prevent IO latency from affecting performance. They are set-and-forget applications which perform very well without impacting our server response times."

-Larry Smith, Spokane Regional Health District

Defrag | Diskeeper | SAN | V-Locity

Diskeeper 2011 outmuscles fragmentation

by Colleen Toumayan 12. January 2011 09:29

Bruce Pechman, the "Muscleman of Technology", stopped by our booth at a CES media event. We talked a bit about the upcoming new Diskeeper release, and he was kind enough to express his enthusiasm to us in writing so we could publish it:

As a TV Technology Journalist, high-end computer enthusiast, hard-core gamer, and a person who is consistently buying the fastest computer money can buy, I need my computers to have super-fast performance without any bottlenecks, every minute of every day.

From expensive SSD’s, to traditional rotational Hard Disk Drives—in every combination you can think of, my computers always run flawlessly and speedily thanks to an awesome software program called Diskeeper!

Right now my primary computer is the “Maingear Shift” with an Intel i980 Extreme Processor overclocked to 4.3 GHz. It’s almost $6,000, and I can’t tolerate any slowdowns from my dual Western Digital 10,000 RPM VelociRaptor hard drives.

The facts are really quite simple. All types of computers will experience disk fragmentation, and it can definitely worsen over time if nothing is done to prevent it. Disk fragmentation is the culprit behind many annoying computer symptoms such as slow application performance, long boot times, unexplained system slowdowns, crashes, etc. Diskeeper prevents these issues from cropping up. I have been religiously using Diskeeper over the years so my computers can realize the full benefits of system performance and reliability...and I can’t wait to install the new Diskeeper 2011—just set it and “forget about it”!

Whether I’m on deadline, overclocking an Intel Extreme processor, or playing Mafia II with every performance benchmark set to maximum settings, Diskeeper is a must install on every computer I own. 

Bruce Pechman, The Muscleman of Technology® (Muscleman of Technology, LLC), is a Consumer Technology and Fitness Personality appearing regularly on “Good Morning San Diego” (KUSI-TV) and the KTLA Morning Show in Hollywood. Bruce’s moniker reflects his 20 years of industry technology expertise and 30 years of fitness training. He has made over 250 live TV appearances on major network television. Visit him at www.mrbicep.com.

Defrag | Diskeeper | SSD, Solid State, Flash

Defragmenting IT Healthcare

by Michael 20. December 2010 05:18

Joe Marion is founder and Principal of Healthcare Integration Strategies, specializing in the integration of imaging technologies with the overall healthcare IT landscape. His blog (at Healthcare Informatics) covers challenges and opportunities specifically relevant to optimizing Healthcare IT initiatives.

Medical images account for a significant percentage of the world's storage requirements, and have been predicted to encompass an even greater percentage of future storage demand. In a recent blog post, Joe posed the question "Is Defragmentation a Boon to Healthcare IT Performance?"

In his post he includes personal observations and insight into the performance implications fragmentation can have on IT as healthcare departments consolidate and standardize application use:

"With departmental solutions, there very likely was less emphasis on system tools such as defragmentation applications.  Now that PACS technology is becoming more intertwined with the rest of IT, there should be greater emphasis on inclusion of these tools.  In addition, server virtualization can mean that previously independent applications are now part of a virtual server farm."

He also makes the astute observation that centralizing computing and storage magnifies bottlenecks, making a solution such as defragmentation increasingly vital:

"The addition of disk-intensive applications such as speech recognition and imaging could potentially impact the overall performance of these applications.  As data storage requirements within healthcare grow, the problem will potentially get worse.  Think of the consequence of managing multiple 3000-slice CT studies and performing multiple 3D analyses.  As more advanced visualization applications go the client-server route, the performance of a central server doing the 3D processing could be significantly impacted."

You can read Joe's blog here.

  

Defrag | Diskeeper | IntelliWrite | V-Locity
