Data Center


Performance Tuning Citrix XenApp and XenDesktop 7.6 Part I: Citrix CloudBridge

Categories: Cloud Computing, Data Center, Replication, Storage

As companies invest in Citrix XenApp and XenDesktop, they want a fast, reliable and secure solution. In this article, we will focus on WAN optimization and performance using Citrix CloudBridge technologies.

Citrix CloudBridge is a unified platform used to accelerate applications across public and private networks, improving performance and the user experience. CloudBridge offers ICA protocol acceleration, QoS, optimization and security for XenApp and XenDesktop. This is an optimal solution for remote or branch offices that have WAN performance issues. CloudBridge offers extensive monitoring and reporting features, which help IT staff performance-tune any Citrix environment.


Cloud Breakdown Part 1: Why Businesses are Avoiding Cloud Services Adoption

Categories: Cloud Computing, Data Center, Virtualization

No one will argue that IT organizations today are tasked to do more with less in terms of resources – whether it’s hardware/software or people to support the infrastructure. Private cloud has been a focal point of conversations that we are frequently having with our customers. Some are highly virtualized and others are trying to get their arms around how to manage the sprawling cloud technologies that are knocking at their door.


The Hotel California Dilemma with Hyperscale Cloud

Categories: Cloud Computing, Data Center, Strategy

At IDS, we are constantly speaking with IT organizations about our IDS Cloud offering and the marketplace in general. For those of you who don’t know, three years ago we built our own IDS Cloud offering based on the latest FlexPod and Vblock technology. Our Cloud is available with roughly 4PB of storage across geographically-dispersed Data Centers. This was a massive investment for IDS that resulted in a valuable ongoing opportunity for our team. I think everyone at IDS has learned many lessons over the past year surrounding the value of our offering versus the Hyperscale Cloud providers like Amazon and Azure. These lessons have shaped our ability to offer a truly valuable cloud offering to our customers.


ONTAP 8.3 Update: QOS Commands, Consider Using Them Today

Categories: Data Center, How To, NetApp

NetApp has included some very powerful troubleshooting commands in the 8.3 update which I’d like to bring to your attention: the qos statistics command and its subcommands. Prior to 8.3, we used the dashboard command to view statistics at the cluster node level. The problem with dashboard is that it reports cluster-level statistics, so it can be difficult to isolate problems caused by a single object. The advantage of the qos statistics commands is that we now have the ability to target specific objects in a very granular fashion.
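As a rough sketch of the granularity on offer, the session below shows a few qos statistics subcommands from the ONTAP 8.3 cluster shell; the vserver and volume names are placeholders for your own objects:

```
cluster::> qos statistics performance show
cluster::> qos statistics workload latency show
cluster::> qos statistics volume performance show -vserver vs1 -volume vol1
```

Each command refreshes on an interval and breaks IOPS, throughput and latency out per workload or per volume, which makes it far easier to spot a single noisy object than the cluster-wide dashboard view.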


Nine Big Data Facts to Guide Your Technology Resolutions

Categories: Data Center, Storage

We recently came across one of Gartner’s newest reports, “Major Myths About Big Data’s Impact on Information Infrastructure,” which explores nine big data myths. An abundance of marketplace hype and news about big data has caused a series of misconceptions surrounding the hot topic. To help straighten out the confusion, Gartner completed valuable research around nine major areas. We read through the lengthy report and compiled the biggest takeaways for you.

If you’d prefer to wade through the report on your own, check out the full Gartner research report instead.

Nine Big Data Takeaways for 2015

  1. Not everyone has already adopted big data. Many organizations exist in a state of fear that they will be the last to develop and implement big data strategies. Thanks to Gartner’s research, we know that in reality only 13% of organizations have actually fully adopted big data solutions. On the flip side, the research also reveals that close to 25% of organizations have no plans to invest in big data at all.
  2. Quantity doesn’t trump quality. Organizations are becoming so obsessed with collecting big data, they’ve forgotten about the importance of quality. Gartner explains that many organizations think that larger data sets mean flaws have less of an impact. In actuality, the number of flaws grows as the data does, meaning the overall impact of errors is unchanged. IT organizations need to put just as much effort, if not more, into big data quality strategy.
  3. Big data won’t eliminate data integration. According to Gartner, using new technologies to replace some data integration steps won’t eliminate the need for integration; it will just change the design. Gartner recommends assessing the existing architecture to pinpoint areas for improvement as top priorities for new engineering approaches.
  4. Data warehouses will still be relevant. While a lot of hype surrounding big data seems to suggest there’s no longer a need for data warehouses, Gartner reports that’s simply not true. According to Gartner, organizations don’t necessarily need to utilize data warehouses during the experimentation phase, but they should still implement them afterwards. Gartner points out that curated data leads to the best scoring and risk models, keeping data warehouses relevant.
  5. Data lakes will not replace data warehouses. Critics are also arguing that data lakes will replace data warehouses. Gartner responds by suggesting that data lakes aren’t quite as easy as they seem. They recommend capitalizing on already-successful data warehouses instead of grappling with the analytical and data-manipulation skills that data lakes demand.
  6. Hadoop will not replace data warehouses. With a third theory, critics suggest that Hadoop will replace data warehouses, but Gartner dispels this one too. Gartner reports that fewer than 5% of organizations are actually planning on replacing data warehouses with Hadoop, and the number is actually decreasing.
  7. Big data has not fully matured. As specified earlier, only 13% of organizations have actually deployed big data solutions. While big data is growing, it’s not a mature market, and there is still an incredible amount that is unknown, making the risks very real.
  8. Big data is not the end-all, be-all. While big data looms large, it’s not the only opportunity for IT development. Gartner’s research shows that processing capacity doubles every 22 to 28 months, memory capacity doubles each year and storage efficiency improves every five years, among other advancements. Instead of reacting in a panic, Gartner suggests that organizations should calm down and develop a logical plan that works for them.
  9. You need to keep data governance in mind. Because big data is just like any other data, only larger, organizations still need to employ data governance initiatives. Gartner mentions that as organizations implement big data, they should re-evaluate their current data governance structure and make sure it is flexible enough to meet big data’s requirements.

Developing Successful Technology Roadmaps for Your Organization

Categories: Data Center, Design & Architecture, How To, Strategy

This is the second of a two-part series on Technology Roadmaps. Previously we explained “The Concepts behind a Technology Roadmap,” and here we explain how to develop one. 

Technology roadmaps begin with a “handshake” between IT and the business. Knowing future business plans allows IT to determine the focus area(s). As businesses evolve and new technologies emerge, IT is challenged with constant change. Developing roadmaps helps IT to be prepared for the change and manage the associated risks.

How Do You Create a Technology Roadmap?

  1. Collect Data. Take the time to gather preliminary information about products, people and processes. Understand current implementations and directions.
  2. Hold Interviews. Identify key stakeholders and gain different perspectives. Meet individually or in groups, and be sure to cover topics like resources, costs, risk, compliance, growth, skills, support and management.
  3. Create technology baselines. Document the essentials and highlight the constraints. Stay sufficiently high-level, but acknowledge the details around recent changes.
  4. Analyze focus areas. Use a structured method for the analysis. One of the most widely used frameworks in business analysis is the SWOT (Strengths, Weaknesses, Opportunities, Threats) model. Since opportunities and threats relate to the industry at large, it is important to have subject matter experts (SMEs) provide input at this stage.
  5. Construct technology roadmaps. This is a collaborative exercise that maps the adoption of emerging technologies over several years. This does not always have to be a chart or a graph. It can be as simple as an enumeration of important technology adoptions in stages. For best results, use a backward sweep starting from end objectives, and then a forward sweep showing how adopting a technology at each stage can lead to the end objective. Repeat both sweeps until they converge.
  6. Present recommendations. Knowing the roadmaps enables you to enumerate the IT projects that need attention in the coming months. There should also be clarity on the investment needed in terms of budget, time and resources.
  7. Host a workshop. Facilitate a workshop where key stakeholders meet again to review the results. This is a necessary touch point to discuss the project-based initiatives and make any final adjustments to the course.
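As a toy illustration of the “enumeration of important technology adoptions in stages” from step 5, the sketch below models a roadmap as a plain data structure. The class names, fields and example entries are invented for illustration only, not part of any formal IDS methodology:

```python
from dataclasses import dataclass, field

@dataclass
class Adoption:
    technology: str   # what is being adopted
    stage: int        # 1 = first planning horizon, higher = later
    objective: str    # the end objective this adoption supports

@dataclass
class Roadmap:
    focus_area: str
    adoptions: list = field(default_factory=list)

    def add(self, technology, stage, objective):
        self.adoptions.append(Adoption(technology, stage, objective))

    def stages(self):
        """Group technologies by stage, earliest first (the 'forward sweep')."""
        ordered = {}
        for a in sorted(self.adoptions, key=lambda x: x.stage):
            ordered.setdefault(a.stage, []).append(a.technology)
        return ordered

# Hypothetical roadmap built backward from an end objective, then read forward
roadmap = Roadmap("Data Center")
roadmap.add("Server virtualization", 1, "Consolidate infrastructure")
roadmap.add("Converged infrastructure", 2, "Consolidate infrastructure")
roadmap.add("Private cloud", 3, "Self-service IT")
print(roadmap.stages())
```

Even a minimal structure like this makes the backward/forward sweeps concrete: each entry names its end objective, and reading the stages in order shows how each adoption leads toward it.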

How effective are Technology Roadmaps?

It all depends on the people and the effort put into the exercise. As indicated in the first part of this two-part series, technology roadmaps bring consensus and improved planning, budgeting and coordination. It is critical that organizations treat this as a project in itself, and provide the necessary funds and resources.

While an internal committee may be established to execute such a project, the benefits of technology roadmaps multiply exponentially when an external partner, like IDS, is involved. IDS guarantees a proven process with expert methodology, and key insight on the final deliverable. A partner like IDS can pre-empt much of the struggle by bringing SMEs to the table and a fresh external perspective.

And remember: As businesses and technologies evolve, so will the roadmaps. So, review them often.

Learn more by reading the first part of this two-part series, “The Concepts Behind a Technology Roadmap.”


Five Signs It’s Time to Invest in Your Data Center

Categories: Data Center, Design & Architecture, How To

In an industry where technology development and advancement moves incredibly fast, even top CIOs may feel like it’s impossible to keep up. While they may feel like their organization is falling behind, how do they determine whether they really are? What counts as “up to date” in our constantly evolving IT landscape, and is that even good enough? It’s easy to let Data Centers get out of control, and unfortunately it’s a risky business to do so. To help cut through the confusion, we’ve compiled the top five signs that your Data Center needs investment. See one or a few things that sound eerily familiar on this list? It may be time for a Data Center upgrade.

Five Signs Your Data Center Needs an Upgrade

  1. Your data center feels like a desert. If you’re carrying around a personal fan while walking through your Data Center, you’re definitely losing the Data Center cooling battle. Some recommend a Computational Fluid Dynamics (CFD) analysis to guide cooling-system arrangement and hot- and cold-aisle containment. If your Data Center continuously suffers from heat stroke, it’s probably not operating at its highest possible capacity.
  2. You skipped spring-cleaning the last 10 years. While it’s easy to let gear pile up, it’s vital to complete some fundamental analysis when it comes to the hardware in your Data Center. Equipment that no longer adds value, or is simply not being used, should be thrown away or donated to a non-profit organization such as a school. Discarding old equipment can have countless benefits, including freeing up power capacity and valuable space.
  3. Your server lifecycle was up three cycles ago. There are multiple reasons why a server lifecycle may come to a close. Because server lifecycles vary greatly with legacy applications and operating systems, determining usable life can be confusing. We follow a general rule of thumb that if the server can no longer meet your required needs after 3 years, replacement or an alternative solution will likely make sense over simple upgrades. Replacing old servers, or incorporating innovative technologies like virtualization, cloud-based services, and converged infrastructure, can help consolidate and optimize the Data Center. In turn, consolidating the Data Center can reduce cabling, management, heating, cooling and ongoing maintenance costs.
  4. Your cabling looks like a rat’s nest. Cabling can easily consume a Data Center if it’s not managed properly. If you’re not labeling, tying down and properly organizing your Data Center cabling, you need a serious revamp of this vital part of the Data Center. This type of disorganization can even lead to human error that can cause downtime to business-critical applications. If a wrongly placed elbow could take your retail business offline for multiple days, it’s time to rethink your cabling strategy. In addition to organization, converged technologies can greatly decrease the cabling in your Data Center.
  5. People are walking around your data center and you don’t know who they are. If you’re finding strangers meandering through your Data Center, it’s probably time to consider the physical security and current measures in place to protect your valuable applications and data. While you may not need a full-time guard dog, your organization may consider implementing key card access, security cameras and a sign-in/sign-out process with regular audits. Keep in mind, the biggest threat can often come from within your organization, so checks and balances are critical. Moving your infrastructure services to the Cloud or colocation facilities can allow you to leverage enterprise-class security and controls without massive capital investment upfront.

Even with the tips above, determining when and how to update your Data Center can be a difficult decision. It’s often a good idea to bring in a third party for a Data Center assessment consultation to make sure you’re receiving unbiased feedback. Taking the time to properly assess your current Data Center infrastructure and plan an integrative upgrade will help deter hasty decisions, and ultimately save critical capital.