Condusiv Technologies Blog

Blogging @Condusiv

The Condusiv blog shares insight into the issues surrounding system and application performance—and how I/O optimization software is breaking new ground in solving those issues.

How to Recover Deleted Files from Network Shares

by Dawn Richcreek 17. October 2019 04:09

You may have discovered—and too late—that while you can recover some deleted files from the Windows Recycle Bin on local machines, you cannot recover deleted files (accidentally or otherwise) from network drive shared folders. If you delete a file from a network share, it is gone. If you look in the Recycle Bin, it won’t be there. 

This happens because Windows is designed so that deleted files are captured by the Recycle Bin on local drives only. If a user deletes a file on a server from a network shared folder, it isn’t being deleted from the local machine, so the Recycle Bin does not capture it. The same is true of files deleted from attached or removable drives, and of files deleted from applications or the Command Prompt. Only files deleted from File Explorer on a machine’s local drive will be saved by the Recycle Bin.

With some types of software, you might be able to recover an earlier saved version of a file deleted from a network shared folder, which would give you the version prior to the deletion. Failing this, the only other way to recover a file deleted from a network share (without a third-party solution—see below) is to have your system administrator retrieve an earlier saved version of the file from the most recent backup. This will only work if:

 

a) A version of the file was actually backed up

b) You can recall the file name so that the system administrator can find it

c) You can recall with some accuracy the time and date when the file was saved

 

This method is, of course, extremely time consuming for the sys admin—and for you, too, if you have to wait. 

Even if the previous version can be retrieved, any work done on the file since the last save is lost forever. 

 

Problem Solved: Undelete

Fortunately, there is a very easy and cost-effective solution to this perpetual issue: Undelete® Instant Data Recovery software from Condusiv. 

1. To permanently solve this problem site-wide, download and install Undelete Server, which is extremely fast and simple, and doesn’t require a reboot to complete the installation (something you really don’t want to have to do on a server running databases or applications requiring constant uptime). 

2. Following installation, the first thing you’ll notice is that the Windows Recycle Bin has been replaced by the Undelete Recovery Bin. The Recovery Bin will not only capture files deleted from network shares, but also files overwritten on the user’s drive, files deleted between backups, and files deleted from the Command Prompt. 

3. Test it for yourself. Create a test file within a network drive shared folder and delete it. You’ll see that your file has, as you would expect, disappeared from the server as well. 

4. Open the Undelete Recovery Bin. You’ll be able to easily navigate to the shared folder from which you deleted the file—and there you’ll find it again. (If you are not an Admin, see Undelete Client below.)

5. You can then select that file and recover it back to its original location, or even to a new location. 

6. You’re done! That’s how easy it is.
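
Under the hood, a recovery bin works by intercepting deletes and moving the file aside instead of unlinking it, so “deleted” files remain recoverable. Here is a minimal Python sketch of that soft-delete pattern (illustrative only; the bin location and function names are assumptions, not Condusiv’s implementation):

```python
import shutil
import time
from pathlib import Path

# Hypothetical bin location; a real tool manages one per protected volume.
RECOVERY_BIN = Path("recovery_bin")

def soft_delete(path: str) -> Path:
    """Move a file into the recovery bin instead of deleting it."""
    src = Path(path)
    RECOVERY_BIN.mkdir(parents=True, exist_ok=True)
    # Timestamp the stored name so repeated deletes of the same
    # file name don't overwrite each other.
    dest = RECOVERY_BIN / f"{src.name}.{int(time.time() * 1000)}"
    shutil.move(str(src), str(dest))
    return dest

def recover(stored: Path, original: str) -> None:
    """Restore a soft-deleted file to its original location."""
    shutil.move(str(stored), original)
```

Because the file is moved rather than unlinked, recovery is just another move back to the original location, which is essentially the workflow shown in the steps above.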

 

Undelete Client

The example above shows Undelete Server being opened on the server itself to recover the file. Users may not have access to the server, but a system administrator can certainly log on and open Undelete Server to recover the file for them.

Once the Undelete Client is installed on a user’s system, however, the user can open Undelete against the remote network share, follow the steps above, and view and recover their own files.

 

Buy Undelete Instant Data Recovery now, and always be able to recover deleted files from network shares. 

 

Purchase Online Now https://www.condusiv.com/purchase/Undelete/

 

Request a Volume Quote https://learn.condusiv.com/Volume-Licensing-Undelete.html

 

Download a Free Trial https://learn.condusiv.com/LP-Trialware-Undelete.html

Tags:

Data Recovery

Ransomware Protection Tips

by Gary Quan 7. October 2019 05:47

You hope that your systems never get attacked by ransomware, but in case they do, you want to be prepared. One of the best ways to recover from such a malicious attack is to keep good, recent backups of your systems. But even then, you can only recover back to the last known good backup. What about the files worked on since that last good backup? To fully recover from a ransomware attack, you want those files recovered too. This is where Undelete® instant file recovery software can help, when set up properly.

Undelete can provide a further level of recovery with its versioning and deleted file protection 

Undelete’s versioning capability can keep copies of files worked on since that last backup, plus any files created and deleted since then. This helps you recover files that are new or updated since the last backup completed. Capturing deleted files can be especially beneficial because some ransomware variants copy the original files to an encrypted form and then delete the originals. In these cases, many of the deleted original files may still be in the Undelete Recovery Bin and available for recovery.

But what about protecting the Undelete Recovery Bin itself from a ransomware attack?

This is where the Common Recovery Bin feature can help. By default, Undelete creates a Recovery Bin folder on each volume it protects, and all the versioned and deleted files from a volume are stored in that volume’s Recovery Bin folder. With the Common Recovery Bin feature, you can instead select a single location on a different volume to hold the versioned and deleted files from all of your protected volumes. For example, you might set up a dedicated X: volume that contains the Recovery Bin files for every protected volume. Then, even if your main system volumes are hit by ransomware, this separate volume may remain safe. This is not fail-safe protection against ransomware, but it is one more deterrent against the Recovery Bin files themselves getting encrypted.
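
Conceptually, versioning works like a copy-on-write hook: before a file is changed or deleted, a timestamped copy lands in a bin that can live on a separate volume. A minimal Python sketch of that pattern (the paths, names, and layout here are assumptions for illustration, not Undelete’s actual design):

```python
import shutil
import time
from pathlib import Path

# Hypothetical common bin on a dedicated volume (e.g. an X: drive on Windows).
COMMON_BIN = Path("X:/RecoveryBin")

def save_version(path: str, common_bin: Path = COMMON_BIN) -> Path:
    """Copy a file's current contents into a timestamped version in the
    common bin before it is overwritten or deleted."""
    src = Path(path)
    # Group versions by their parent folder so files from different
    # protected locations stay separated.
    slot = common_bin / src.resolve().parent.name
    slot.mkdir(parents=True, exist_ok=True)
    dest = slot / f"{src.name}.{int(time.time() * 1000)}"
    shutil.copy2(str(src), str(dest))  # copy2 preserves timestamps/metadata
    return dest
```

Because the versions live on a different volume than the files being protected, ransomware that encrypts and deletes files on the system volumes does not automatically reach them.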

Although you may have purchased or tried Undelete for recovering accidentally deleted user files from local drives or network shares, it can also provide added recovery benefits after malicious attacks.

If you need additional Undelete licenses, you can contact your account manager or buy instantly online.

You can also download a free 30-day trial of Undelete.

Learn more about Undelete from this series of videos.

Condusiv’s V-locity Technology Was Recently Certified as Citrix Ready

by Dawn Richcreek 11. September 2019 09:51

 

We are proud to announce that Condusiv’s V-locity® I/O reduction software has been certified as Citrix Ready®. The Citrix Ready program helps customers identify third-party solutions that enhance virtualization, networking and cloud computing solutions from Citrix Systems, Inc. V-locity, our innovative and dynamic alternative to costly hardware overhauls, has completed a rigorous verification process to ensure compatibility with Citrix solutions, providing confidence in joint solution efficiency and value. The Citrix Ready program makes it easy for customers to identify complementary products and results-driven solutions that can enhance Citrix environments and increase productivity.

Verified Performance Improvements of 50 Percent or More

To obtain the Citrix Ready certification, we ran IOMeter benchmark tests—an industry-standard tool for testing I/O performance—on a Windows 10 system running Citrix XenDesktop’s Virtual Delivery Agent (VDA).

The IOMeter benchmark utility was set up to run 5 different tests with variations in the following parameters:

 •  Different read/write packet sizes (512B to 64KB)
 •  Different read/write ratios (e.g., 50% reads/50% writes, 75% reads/25% writes)
 •  Different mixtures of random and sequential I/Os

The tests showed drastic improvements with V-locity enabled versus disabled: performance improved by around 50% on average. In one test case, IOPS (I/Os per second) increased from 2,903 to 5,525, a 90% improvement.
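
As a quick sanity check, the headline percentage follows directly from the raw counters; a few lines of Python reproduce it:

```python
def pct_improvement(before: float, after: float) -> float:
    """Percentage improvement of `after` relative to `before`."""
    return (after - before) / before * 100.0

# IOPS figures from the IOMeter run described above.
print(f"IOPS gain: {pct_improvement(2903, 5525):.0f}%")  # prints "IOPS gain: 90%"
```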

[Chart: detailed test results of the 5 test variations]

We also compared the V-locity Dashboard results for the same IOMeter benchmark with V-locity disabled and then enabled, and found additional improvements.

With V-locity enabled, over 8 million I/Os were eliminated from having to travel through the network and storage to be satisfied, which immensely increased the I/O capacity of the system. Because the latency of these eliminated I/Os is known, we can also highlight that they saved more than an hour of storage I/O time.

Additionally, the workload (amount of data read/written) increased from 169GB to 273GB, meaning 60% more work was being done in the same amount of time.  

Customers can be confident that V-locity has successfully passed an exhaustive series of tests established by Citrix. The V-locity technology works effectively with Citrix solutions and can give customers performance gains of 50% or more on their heaviest workloads. V-locity lets customers “set it and forget it”: once it is installed, systems improve instantly with little to no maintenance.

Our CEO, Jim D’Arezzo, noted, “We are proud to partner with Citrix Systems. It’s important to remember that most I/O performance issues are caused by the operating system, particularly in the Windows environment. When compared to a hardware upgrade, the software solutions Condusiv offers are far more effective—both in terms of cost and result—in increasing overall system performance. We offer customers intelligent solutions that now combine our V-locity with Citrix XenDesktop. We can’t wait to continue to work with the trusted partners in the Citrix Ready ecosystem.”

 

Download free 30-day trial of V-locity


Case Study: Non-Profit Eliminates Frustrating Help Desk calls, Boosts Performance and Extends Useful Hardware Lifecycle

by Marissa Newman 9. September 2019 11:47

When PathPoint was faced with user complaints and productivity issues related to slow performance, the non-profit organization turned to Condusiv’s I/O reduction software to not only optimize their physical and virtual infrastructure but to extend their hardware lifecycles, as well. 

As technology became more relevant to PathPoint’s growing organization and mission of providing people with disabilities and young adults the skills and resources to set them up for success, the IT team had to find a solution to make the IT infrastructure as efficient as possible. That’s when the organization looked into Diskeeper® as a solution for their physical servers and desktops.

“Now when we are configuring our workstations and laptops, the first thing we do is install Diskeeper. We have several lab computers that we don’t put the software on, and the difference is obvious in day-to-day functionality. Diskeeper has essentially eliminated all helpdesk calls related to sluggish performance,” reported Curt Dennett, PathPoint’s VP of Technology and Infrastructure.

Curt also found that workstations with Diskeeper installed have a 5-year lifecycle, versus only 3 years for the lab computers without it, and he saw similar results on his physical servers running full production workloads. Curt observed, “We don’t need to re-format machines running Diskeeper nearly as often. As a result, we gained back valuable time for other important initiatives while securing peak performance and longevity out of our physical hardware assets. With limited budgets, that has truly put us at ease.”

When PathPoint expanded into the virtual realm, Curt looked at V-locity® for their VMs and, after reviewing the benefits, brought the software into the rest of their environment. The organization found that with the powerful capabilities of Diskeeper and V-locity, they were able to offload 47% of I/O traffic from storage, resulting in a much faster experience for their users.

The use of V-locity and Diskeeper is now the standard for PathPoint. Curt concluded, “The numbers are impressive, but what’s more for me is the gut feeling and the experience of knowing that the machines are actually performing efficiently. I wouldn’t run any environment without these tools.”

 

Read the full case study

 

Try V-locity FREE for yourself – no reboot is needed

Cost-Effective Solutions for Healthcare IT Deficiencies

by Jim D’Arezzo, CEO 26. August 2019 05:22

Managing healthcare these days is as much about managing data as it is about managing patients themselves.  The tsunami of data washing over the healthcare industry is a result of technological advancements and regulatory requirements coming together in a perfect storm.  But when it comes to saving lives, the healthcare industry cannot allow IT deficiencies to become the problem rather than the solution.

The healthcare system generates about a zettabyte (a trillion gigabytes) of data each year, with sources including electronic health records (EHRs), diagnostics, genetics, wearable devices and much more. While this data can help improve our health, reduce healthcare costs and predict diseases and epidemics, the technology used to process and analyze it is a major factor in its value.

According to a recent report from International Data Corporation, the volume of data processed in the overall healthcare sector is projected to increase at a compound annual growth rate of 36 percent through 2025, significantly faster than in other data-intensive industries such as manufacturing (30 percent projected CAGR), financial services (26 percent) and media and entertainment (25 percent).

Healthcare faces many challenges, but one that cannot be ignored is information technology. Without adequate technology to handle this growing tsunami of often-complex data, medical professionals and scientists can’t do their jobs. And without that, we all pay the price.

Electronic Health Records

Over the last 30 years, healthcare organizations have moved toward digital patient records, with 96 percent of U.S. hospitals and 78 percent of physicians’ offices now using EHRs, according to the National Academy of Medicine. A recent report from market research firm Kalorama Information states that the EHR market topped $31.5 billion in 2018, up 6 percent from 2017.

Ten years ago, Congress passed the Health Information Technology for Economic and Clinical Health (HITECH) Act and invested $40 billion in health IT implementation.

The adoption of EHRs is supposed to be a solution, but instead it is straining an overburdened healthcare IT infrastructure. This is largely because of the lack of interoperability among the more than 700 EHR providers. Healthcare organizations, primarily hospitals and physicians’ offices, end up with duplicate EHR data that requires extensive (not to mention non-productive) search and retrieval, which degrades IT system performance.

More Data, More Problems

IT departments are struggling to keep up with demand. Like the proverbial Dutch boy with his finger in the dike, IT staff can barely hold back the sheer volume of data, much less meet users’ performance demands.

We can all relate to this problem.  All of us are users of massive amounts of data.  We also have little patience for slow downloads, uploads, processing or wait times for systems to refresh. IT departments are generally measured on three fundamentals: the efficacy of the applications they provide to end users, uptime of systems and speed (user experience).  The applications are getting more robust, systems are generally more reliable, but speed (performance) is a constant challenge that can get worse by the day.

From an IT investment perspective, improvements in technology have given us much faster networks, much faster processing and huge amounts of storage.  Virtualization of the traditional client-server IT model has provided massive cost savings.  And new hyperconverged systems can improve performance as well in certain instances.  Cloud computing has given us economies of scale. 

But costs will not easily be contained as the mounting waves of data continue to pound against the IT breakwaters.   

Containing IT Costs

Traditional thinking about IT investments goes like this.  We need more compute power; we buy more systems.  We need faster network speeds; we increase network bandwidth and buy the hardware that goes with it.  We need more storage; we buy more hardware.  Costs continue to rise proportionate to the demand for the three fundamentals (applications, uptime and speed).

However, there are solutions that can help contain IT costs.  Data Center Infrastructure Management (DCIM) software has become an effective tool for analyzing and then reducing the overall cost of IT.  In fact, the US government Data Center Optimization Initiative claims to have saved nearly $2 billion since 2016.

Other solutions that don’t require new hardware to improve performance and extend the life of existing systems are also available. 

What is often overlooked is that processing and analyzing data is dependent on the overall system’s input/output (I/O) performance, also known as throughput. Many large organizations performing data analytics require a computer system to access multiple and widespread databases, pulling information together through millions of I/O operations. The system’s analytic capability is dependent on the efficiency of those operations, which in turn is dependent on the efficiency of the computer’s operating environment.

In the Windows environment especially (which runs about 80% of the world’s computers), I/O performance degrades over time. This degradation, which can cut a system’s overall throughput capacity by 50 percent or more, happens in any storage environment: inefficiencies in the way Windows hands data off to storage penalize performance. This occurs in any data center, whether in the cloud or on premises, and it gets worse in a virtualized computing environment, where the multitude of systems all sending I/O up and down the stack to and from storage creates tiny, fractured, random I/O. The result is a “noisy” environment that slows down application performance and, left untreated, only worsens with time.

Even experienced IT professionals mistakenly think that new hardware will solve these problems. Since data is so essential to running organizations, they are tempted to throw money at the problem by buying expensive new hardware. While additional hardware can temporarily mask the degradation, targeted software can improve system throughput by 30 to 50 percent or more. Software like this has the advantage of being non-disruptive (no ripping and replacing hardware), and it can be transparent to end users because it works in the background. A software solution can thus handle more data by eliminating overhead, increase performance at a far lower cost, and extend the life of existing systems.
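
The kind of overhead elimination described above can be pictured as write coalescing: instead of letting thousands of tiny, fractured writes travel down the stack individually, a software layer batches them into fewer, larger operations. A toy Python sketch of the idea (illustrative only; the class and buffer size are assumptions, not Condusiv’s engine):

```python
class CoalescingWriter:
    """Toy illustration of I/O reduction by write coalescing: many small
    application writes are buffered and issued to 'storage' as fewer,
    larger operations."""

    def __init__(self, backend_write, buffer_limit: int = 64 * 1024):
        self.backend_write = backend_write  # the expensive storage-level write
        self.buffer_limit = buffer_limit
        self.buffer = bytearray()
        self.app_writes = 0       # writes issued by the application
        self.storage_writes = 0   # writes that actually reach storage

    def write(self, data: bytes) -> None:
        self.app_writes += 1
        self.buffer += data
        if len(self.buffer) >= self.buffer_limit:
            self.flush()

    def flush(self) -> None:
        if self.buffer:
            self.backend_write(bytes(self.buffer))
            self.storage_writes += 1
            self.buffer.clear()

# Example: 1,000 small application writes reach "storage" as a handful
# of larger writes carrying the same data.
storage = []                           # stand-in for the storage layer
w = CoalescingWriter(storage.append)
for _ in range(1000):
    w.write(b"x" * 512)                # 512-byte application writes
w.flush()
print(w.app_writes, w.storage_writes)  # prints "1000 8"
```

The application still issues 1,000 writes, but only 8 larger operations hit storage, which is the sense in which such software “eliminates” I/Os rather than data.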

With the tsunami of data threatening IT, solutions like these should be considered in order to contain healthcare IT costs.


Download V-locity - I/O Reduction Software  

Tags:

Application Performance | EHR
