Signs you need a Data Agility Plan. Soon!

ASigismondi

One hot topic that every modern business has on its radar for 2016 is Business Agility. In fact, reflecting on all my recent customer visits, no matter where my business conversations start, I often end up discussing a lack of Business Agility in one form or another.

 

I have found that IT organizations are facing more challenges than ever when it comes to being agile in supporting the company's business objectives. But if there is one huge elephant in the room that I would like to address today, it is Data Agility. For me, Data Agility is to Business Agility what water is to humans: without it, we wouldn't exist. As humans we can have a nice house, luxury furniture, every possible service and even food… but without running water, all that investment in our home infrastructure would be worthless. Data Agility plays the same role for the business. Too extreme an analogy?

 

I would like to share (and also hear from you) possible "symptoms", based on my findings, that indicate you could have a Data Agility problem. Before I continue, let's narrow the symptoms down to a common use case:

 

Your organization has an in-house, business-critical application, supported by a team of internal developers. This application relies on large DBs and files that are also used by other mission-critical (but secondary) functions such as backup and, say, business analytics applications. Someone or something needs to make sure these secondary functions can access the DB data. Pretty common, right? I hope you agree.

 

1. Time and resources to provision data

Your primary application needs to be developed, maintained and tested by your development team. It's no surprise that the development team will frequently ask for DB copies or a DB refresh. I have found that organizations use a significant amount of resources to move a single DB from the production environment to the development environment. Sometimes these environments are on different networks and are often isolated, making things even harder. In the worst (but very common) cases, a developer will wait weeks (!?!) to get their DB refreshed, and the entire process, from the request to the delivery, will touch resources from DBAs, Backup, Storage, Networking and Ops. Ouch! That's extremely inefficient and costly.
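
To make the cost concrete, here is a minimal sketch, in Python, of the kind of manual refresh chain a request like this usually triggers: dump production, copy the dump across networks, and restore it into dev. The host names, database name and PostgreSQL tooling below are assumptions for illustration only; the point is that every step is typically owned by a different team, which is exactly where the weeks go.

```python
# refresh_dev_db.py -- illustrative sketch only: hypothetical hosts, paths
# and a PostgreSQL source are assumed; in practice each step is run by a
# different team (DBA, networking, ops), often days apart.
import subprocess

PROD_HOST = "prod-db.example.com"   # assumption: production DB host
DEV_HOST = "dev-db.example.com"     # assumption: development DB host
DB_NAME = "orders"                  # assumption: the business-critical DB
DUMP_FILE = "/tmp/orders.dump"

def run(cmd):
    """Run a shell command and fail loudly, like the real scripts do."""
    print("running:", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. DBA team: dump the production database (custom format for pg_restore).
run(["pg_dump", "-h", PROD_HOST, "-Fc", "-f", DUMP_FILE, DB_NAME])

# 2. Networking/ops: move the dump into the (often isolated) dev network.
run(["scp", DUMP_FILE, f"{DEV_HOST}:{DUMP_FILE}"])

# 3. DBA team again: drop and recreate the dev copy, then restore it.
run(["ssh", DEV_HOST, "dropdb", "--if-exists", DB_NAME])
run(["ssh", DEV_HOST, "createdb", DB_NAME])
run(["ssh", DEV_HOST, "pg_restore", "-d", DB_NAME, DUMP_FILE])

# 4. Sensitive columns still have to be masked before developers log in.
```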

 

2. Your valuable, skilled resources spend more time on repetitive, low-value tasks

You have highly paid, skilled resources forced to spend valuable time managing data provisioning requests (e.g. making copies, running scripts for masking data) instead of producing something new or being innovative. No one really thinks about how they got to this stage; it just happened. The worst-case scenario is losing those talented employees and replacing them with lower-skilled personnel, which ultimately means a decrease in the value of your human capital.
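
To illustrate how repetitive this work is, here is a minimal masking sketch in Python. The table and column names (customers, email, ssn) and the SQLite stand-in are assumptions, not any particular customer's setup; a DBA ends up hand-running something like this against every fresh copy before handing it over:

```python
# mask_copy.py -- illustrative sketch only; table and column names are made up.
import hashlib
import sqlite3  # stand-in for the real database driver

def mask(value: str) -> str:
    """Replace a sensitive value with a deterministic, non-reversible token."""
    return hashlib.sha256(value.encode("utf-8")).hexdigest()[:12]

conn = sqlite3.connect("dev_copy.db")  # assumption: the freshly restored copy
cur = conn.cursor()

# Mask personally identifiable columns row by row before the copy is released.
rows = cur.execute("SELECT id, email, ssn FROM customers").fetchall()
for row_id, email, ssn in rows:
    conn.execute(
        "UPDATE customers SET email = ?, ssn = ? WHERE id = ?",
        (mask(email), mask(ssn), row_id),
    )

conn.commit()
conn.close()
```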

 

3. Your business can't deliver fast enough to keep up with market demand

“Our competitors are able to deliver applications to market faster”, “Our engineering testing processes are slow and old-fashioned”, “Churn in engineering is high for some reason, I guess everyone wants to work at Google or Facebook…”. I don’t want to open the DevOps discussion with you today (we will…), but based on my findings, a lack of Data Agility could be a good chunk of the problem.

4. Storage spending keeps increasing

Do you really not know, or do you not want to know? You are more than likely spending a significant portion of your budget on storage to house copies of production data that are provisioned multiple times to serve many different business-critical applications.
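
A quick back-of-the-envelope calculation makes the point; every number below is an assumption, not a benchmark:

```python
# copy_cost.py -- back-of-the-envelope only; all figures are assumptions.
PROD_DB_TB = 5            # size of one production database, in TB
FULL_COPIES = 6           # dev, test, QA, reporting, backup staging, analytics
COST_PER_TB_YEAR = 2000   # assumed fully loaded storage cost, USD per TB/year

copy_capacity_tb = PROD_DB_TB * FULL_COPIES
annual_cost = copy_capacity_tb * COST_PER_TB_YEAR

print(f"{copy_capacity_tb} TB of capacity just for copies")          # 30 TB
print(f"~${annual_cost:,} per year, before the next refresh cycle")  # ~$60,000
```

Thirty terabytes of duplicated capacity for a single 5 TB database would not be unusual under these assumptions, and that is before the second or third production database goes through the same cycle.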

 

5. Once data is out of the data owner's control, the risks are unknown

When I discuss with DBAs what their involvement is, from a security and control perspective, once they have made copies of critical DBs to enable other IT functions, the majority of them actually… do not care. They don't mind what happens to that data once they have handed it over. They do care about backup (their insurance policy) and about making sure DB data is properly masked before someone gets a new copy, but after that they often struggle to say what the end user does with it. Are they making new copies? Are they mounting the DB on secure systems? Are they giving access to people they shouldn't? This is bad news: not having visibility into who has a copy, how many copies exist and who has access to them is something we should have under control at all times.

 

6. Agents, automated scripts, etc. overload your major application servers just to extract their data

I've found that many organizations make "Band-Aid" decisions when they are suffering; the pain and urgency usually don't leave room for better long-term decisions. I have seen application servers overloaded by backup agents, snapshot scripts and the like, increasing the risk of failure. The prevailing mood is: if it's working, don't touch it!

 

So, in the end, we can conclude that somehow we are all data slaves. We shouldn't work for our data; our data should work for us.

 

Now it's your turn. What other symptoms do you see out there?