Data governance, the management of data and of who may access it within an organisation, has become an area of significant concern for many modern enterprises, especially those expanding their size, scope or premises.
Server storage capacities, both in the cloud and on the local network, are now measured in terabytes. Left unchecked, this abundance allows project files to be replicated throughout a network, and unrestricted access to project folders compounds the risk of data exposure.
Ideally, organisations should have enforceable policies regarding their data governance embedded in their management systems, supported by an adequate telecommunications infrastructure and the appropriate collaborative tools. Unfortunately, this is not always the case, which can lead to conflicting information and project delays.
“Data governance on large projects is a challenge,” says Neil Thompson, director for digital construction at design, engineering and project management consultancy Atkins. “We have to make sure we transmit the right information, to the right person, at the right time and at the appropriate resolution.”
The warning signs of poor data governance frequently come too late, usually at critical moments in a project lifecycle. They can present themselves as vital information missing from projects, conflicting project data, employees lacking access to information they need, or employees and former employees retaining access to information they no longer need to know following a job change.
“Data governance is an internal problem for all sorts of businesses – I have come across these issues in design, with businesses like architects, and I see it an awful lot in software development,” says Colin Tankard, managing director of security services firm Digital Pathways.
“Where you have different teams working on a project together, and they are all working on their own little bit – how do you know you are working on the latest release?”
Duplication of data
The most frequent example of poor data governance is the duplication of project data, where multiple copies of the same file can be found across an organisation’s internal network. This is often done for ease of access, or because a development team needs to reference a file that is being worked on elsewhere.
“The difficulty is not necessarily the shared data we manage in the cloud, it’s how we manage ‘work in progress’ data, to ensure that having multiple users working on data doesn’t result in multiple versions being saved,” says Thompson. “It requires discipline from all team members.”
A prime example of duplication is the handling of a CV submitted to an organisation for consideration. Often, that CV will be shared between multiple human resources managers and team leaders. “Since the GDPR [General Data Protection Regulation] came in, we have done a lot more work on data discovery, and the data duplication has been mind-blowing, with data being duplicated up to 50 times,” says Tankard.
One consequence of file duplication is the inadvertent creation of shadow clouds and networks: areas beyond an organisation’s control where project files can be found.
Rather than remotely connecting to the organisation’s network, employees wishing to work from home or during their commute might save files to their personal cloud storage accounts or on personal devices, before transferring the files back to the network later. Not only are these files outside the protection of the company servers, they are also outside the versioning and audit controls of the organisation’s management systems.
“You have got where you think your data is on your network, but you also have this haze around it where your data is floating around, which is not in your control anymore as an organisation,” says Tankard.
“The collaboration of data is where you suddenly see that all breaking down, because you are not necessarily working on the latest if it has not been put back on the proper network, and you now have another team working on the old file.”
These two elements can both lead to conflicting project information, whereby multiple versions of the same file are being actively used on the same project. “Currently, it’s just too difficult to collaborate digitally across many disciplines, due to technology being hard to use and difficult to integrate,” says Thompson.
These issues are more than frustrating: they point to underlying data governance failings within a company. They can present themselves as delays to projects starting because employees do not yet have access to the project folder, poor efficiency arising from versioning conflicts, and possible breaches of GDPR through the inadvertent long-term storage of personal data.
“Security of data in our sector is critical, as we continue to systematically connect critical infrastructure to the internet and create digital twins of the physical environment,” says Thompson.
“This has a profound impact on security; the risk is no longer around misplaced data on USB sticks and printed plans left on trains. We have to manage every user’s ability to have instant access and control of these assets from any device, at any location on the planet.”
These issues can frequently be put down to three core failings: lack of awareness of an organisation’s data-sharing policies, inadequate telecommunications infrastructure, or poor Active Directory management.
Rarely are these issues caused by malicious employees, but rather by employees seeking a workaround in order to meet project deadlines. “If the systems were there, and they could easily access them from wherever, then people would use it,” observes Tankard. “People do not necessarily want to copy stuff everywhere; they want to use the system that is in place.”
Remote access tools
Some of the difficulties due to data duplication can be overcome by using remote access tools. These create an encrypted link between a device such as a laptop (preferably company-owned) and the corporate network, enabling employees to work from home when necessary.
Multifactor authentication (MFA) can be used to protect against unauthorised access. While such tools have historically suffered problems, a suitable remote access tool allows much better control of company data than improvised storage on employees’ own devices.
There are also a variety of collaboration, data management and product lifecycle tools available, such as Teamcenter, WebCenter and SharePoint, which organisations can use to provide file versioning control within their network.
Tools that use a check-in/check-out process for documentation allow employees to ensure they are working on the latest version. Typically, while a file is checked out, other users cannot edit it; they can only view the last saved version.
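A check-in/check-out workflow can be sketched in a few lines of Python. The registry class, file name and persistence format below are illustrative, not any particular product’s API:

```python
import json
from pathlib import Path


class CheckoutRegistry:
    """Minimal sketch of a check-in/check-out model: while a document
    is checked out, no other user may check it out for editing."""

    def __init__(self, path="checkouts.json"):
        self.path = Path(path)
        # Persist locks so they survive between sessions.
        self.locks = json.loads(self.path.read_text()) if self.path.exists() else {}

    def check_out(self, document, user):
        holder = self.locks.get(document)
        if holder and holder != user:
            raise PermissionError(f"{document} is checked out by {holder}")
        self.locks[document] = user
        self._save()

    def check_in(self, document, user):
        if self.locks.get(document) != user:
            raise PermissionError(f"{user} does not hold the lock on {document}")
        del self.locks[document]
        self._save()

    def _save(self):
        self.path.write_text(json.dumps(self.locks))
```

In a real system the registry lives on the server behind the collaboration tool, so the lock is visible to every user rather than only to the one holding it.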
“Some of those collaboration tools are live-dynamic tools, where the system will periodically save automatically and show the latest version, so you can get constant updates of changes,” says Tankard. “Sites like GitHub have also become quite popular, as you can almost put your development chain in a central repository.”
Alongside this, organisations can adopt data classification policies, using applications such as Titus’s data protection suite, which allows for the classification and management of data in a network.
These policies empower employees to identify critical files for additional protection, as well as to highlight those that can be deleted after a set period, thereby eliminating unnecessary data retention.
Data assessment and data cleanup applications can be another aspect of an organisation’s data governance policy. These enable organisations to discover, recognise and act upon duplicated data, without moving it to a repository or specialty application. Using such tools to resolve simpler tasks allows network administrators to focus on the more complex issues.
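The core of such duplicate discovery is straightforward: hash the contents of every file and group identical digests. A minimal sketch in Python (the directory layout is whatever the tool is pointed at):

```python
import hashlib
from collections import defaultdict
from pathlib import Path


def file_digest(path, chunk_size=1 << 20):
    """Hash a file in chunks so large files need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk_size):
            digest.update(block)
    return digest.hexdigest()


def find_duplicates(root):
    """Group files under `root` by content hash; any group holding more
    than one path is a set of byte-identical duplicates."""
    groups = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            groups[file_digest(path)].append(path)
    return {h: paths for h, paths in groups.items() if len(paths) > 1}
```

Commercial tools add refinements such as comparing file sizes first to avoid hashing unique files, but the grouping principle is the same.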
Underpinning any sound data governance policy is ensuring employees are adequately informed about how data should be shared within the organisation. This education should work on two complementary fronts: employees being regularly reminded how they should share information internally, and management teams leading by example. Appropriate technical measures, such as blocking unacceptable copying of files, can also enforce the data governance policies.
Organisations can take steps to ensure files cannot be saved elsewhere, using access control systems such as Citrix XenDesktop to block employees from downloading files to personal devices or local hard drives. These force employees to keep data only in approved locations.
“Unless you go down those routes, it is very difficult to control the versions. All you can do is put in place the processes that allow people to access the data easily from home, to encourage [them] not to put it on their hard drives and work on it offline,” says Tankard.
A primary element of any data governance policy is ensuring that employees link to files rather than share copies of them. Not only does this mitigate versioning conflicts and make GDPR breaches less likely, it also reduces an organisation’s required storage capacity.
Two of the most common difficulties encountered with this policy are the unintentional saving of changes to files, or the link’s recipient not having access to the file location. However, these problems can be overcome with access management protocols identifying who can access, save and create data within each project folder.
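The effect of such protocols can be modelled simply: each project folder carries a list of who may read it and who may save into it, and a link only resolves for users on those lists. A minimal sketch in Python, with illustrative folder and user names:

```python
# Minimal per-folder access model: each project folder maps to the set
# of users allowed to open links into it, and the subset allowed to save.
PROJECT_ACL = {
    "projects/bridge-redesign": {
        "read": {"alice", "bob", "carol"},
        "write": {"alice", "bob"},
    },
}


def can_follow_link(user, folder, save=False):
    """Return True if `user` may open (or, with save=True, modify)
    files reached through a link into `folder`."""
    acl = PROJECT_ACL.get(folder)
    if acl is None:
        return False  # unknown folder: deny by default
    allowed = acl["write"] if save else acl["read"]
    return user in allowed
```

Denying by default for unknown folders is the important design choice: a mistyped or stale link fails closed rather than exposing data.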
Ensuring that an adequate telecommunications infrastructure is maintained is critical to support any data governance policy. Such infrastructure forms the backbone of any organisation’s network and can frequently be the underlying cause of many issues around data governance.
“It has to come down to the communications and the systems being available, and this is where the cloud has been a big benefit,” says Tankard.
Active Directory permissions management should also form a cornerstone of any data governance policy, as many issues around information accessibility can be traced to it. Frequently, this is because Active Directory management is the sole responsibility of a single member of an organisation’s IT team.
Active Directory, however, is not the easiest tool to use. As a result, IT teams tend to let access requests accumulate and process them in batches, which leads to a bottleneck in the process.
This can be overcome by several methods, the easiest of which is sharing the responsibilities of Active Directory management among other members of the IT team, thereby lessening the individual workload.
Alternatively, there are various overlays for Active Directory that make the system more intuitive and easier to use. Using these overlays, Active Directory management can, in some cases, be delegated to team leaders.
Doing this not only lessens the workload on IT teams, but also gives team leaders, who are naturally better informed about who requires access to projects, more direct control over access rights. “You do not have to understand Microsoft’s spurious ways of managing Active Directory, as these tools are very intuitive and simple,” says Tankard.
Alongside this are Active Directory clean-up tools, such as AD Tidy, which allow IT teams to remove obsolete user accounts and account groups.
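The clean-up logic itself is simple to express. A hedged sketch in Python that flags stale accounts from an exported account list; the CSV column names (`sAMAccountName`, `lastLogon` as ISO dates) are illustrative assumptions rather than a fixed Active Directory export format:

```python
import csv
from datetime import datetime, timedelta


def stale_accounts(csv_path, days=90, today=None):
    """Return account names whose last logon is more than `days` ago.
    Expects a CSV with 'sAMAccountName' and 'lastLogon' (ISO date)
    columns - names chosen for illustration."""
    today = today or datetime.now()
    cutoff = today - timedelta(days=days)
    stale = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            last_logon = datetime.fromisoformat(row["lastLogon"])
            if last_logon < cutoff:
                stale.append(row["sAMAccountName"])
    return stale
```

A flagged account would then be reviewed and disabled through the directory itself; automating the deletion step is riskier than automating the discovery.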
Another method for governing accessibility, especially with departing employees, is through using two-factor authentication, whereby access to the company network requires a password and a physical token.
Since all equipment belonging to the organisation, including the authentication token, is returned at the exit interview, even if a former employee’s account remains active, they will be unable to access any data.
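Many such tokens implement the time-based one-time password algorithm of RFC 6238, so revoking the token (or the seed stored for it) renders the password alone useless. A minimal sketch of the SHA-1 variant using only the Python standard library:

```python
import hashlib
import hmac
import struct
import time


def totp(secret: bytes, for_time=None, step=30, digits=6):
    """RFC 6238 time-based one-time password (SHA-1 variant)."""
    # The moving factor is the number of `step`-second intervals elapsed.
    counter = int((for_time if for_time is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    # Dynamic truncation: the low nibble of the last byte picks an offset.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

With the RFC 6238 test secret `12345678901234567890` and time 59 seconds, the 8-digit output matches the specification’s published vector, which is a useful sanity check for any implementation.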
By bringing data governance under stricter control, using collaborative tools and rigid policies, organisations will find that not only is their data security improved, but their efficiency and cost-effectiveness improve as well.
However, organisations should continue to remain aware of the latest developments in collaborative tools in order to maintain their competitive edge.
“In the future, technology and processes are likely to become more secure, more transparent and more user friendly,” says Thompson.