Seven Tips for Managing Multiple Data Streams
This article has been contributed by Alasdair Robertson of CTMS.
Integrating multiple data streams is critical to achieving success with automated IT Asset Management.
The saying “garbage in, garbage out” still rings true, and your data makes the difference between getting the right answer from your system and simply getting any answer.
These are my top tips for best managing those data streams:
- GOALS: Understand what outputs you want from the data sets you are merging. Identify which operational and management reports you need, which integrations of merged data to third-party systems (financial, service desks) will be required, and what data users will be updating on a daily basis. There is little point in joining all the company systems together and duplicating a large quantity of data if there is no use for it.
- DATA QUALITY: Get the data quality right; you should be aiming for 95%+ accuracy. Poor-quality data is information that is out of date, badly formatted (e.g. text in date fields) or irrelevant to the project. Don't import data with low integrity: it will only hamper the system's evolution and its adoption by the organisation. A simple validation pass before import catches most of this (see the first sketch after this list).
- PRIMARY KEY: Whether computer name, serial number or another field, make sure you have a suitable primary key to join the data sets together. Good-quality data and a primary key should allow you to integrate data seamlessly into the target system. I always add a secondary unique key to data being imported from spreadsheets. This can be imported into a spare field in your target system, and it means you can always trace the data back to its source. It also allows you to add additional data fields without reimporting the entire data set (see the join sketch after this list).
- FEDERATION: You can choose systems that let you federate data. Instead of duplicating data between databases, the system looks up data from a second system on the fly. The appeal of these systems is that your data is always current, rather than only as fresh as the last scheduled batch transfer, which is typically run hourly or overnight.
- KEEP IT SIMPLE: Don’t overcomplicate your system. There are web-services-based integration systems out there that cost around £50K; for that you get bells and whistles, although it still won’t make you a cup of tea. Whilst these may be technically excellent systems, providing you with multiple exception reports and data quality checks, they may be completely over-specified for your requirements. Sometimes a simple SQL extract from one system followed by an import into a second is sufficient (a minimal example follows this list). Keeping integrations as simple as possible minimises costs and potential design flaws and, when the integrators walk away, the result should be simple to maintain.
- TEST: Test, test and test again. Plan, then test the system repeatedly, and with different data sets if the integration is to be run repeatedly. Getting the bugs out during the test phase saves a great deal of time later.
- DOCUMENTATION: Make sure you get complete integration documentation and copies of any code or scripts. This may cost you a little extra, but it is worth every penny when it comes to maintaining the integration in the future.
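To make the data quality tip concrete, here is a minimal sketch of a pre-import validation pass in Python. The file name, column names (`serial_number`, `purchase_date`) and date format are illustrative assumptions, not taken from any particular ITAM tool.

```python
import csv
from datetime import datetime

def validate_rows(path, date_field="purchase_date", key_field="serial_number"):
    """Split a CSV export into clean rows and rejects before import.

    Rows are rejected if the primary key is blank or the date field
    does not parse, e.g. free text typed into a date column.
    """
    clean, rejects = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if not (row.get(key_field) or "").strip():
                rejects.append((row, "missing primary key"))
                continue
            try:
                datetime.strptime(row[date_field], "%Y-%m-%d")
            except (KeyError, TypeError, ValueError):
                rejects.append((row, "bad or missing date"))
                continue
            clean.append(row)
    return clean, rejects

clean, rejects = validate_rows("assets.csv")  # hypothetical export file
total = len(clean) + len(rejects)
if total:
    print(f"Data quality: {len(clean) / total:.1%} (aim for 95%+)")
```

The rejects list doubles as an exception report to send back to the data owner, rather than letting bad records leak into the target system.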
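The secondary-key idea from the PRIMARY KEY tip can be as simple as stamping each spreadsheet row with a generated identifier before the join. A sketch, with invented field names (`serial`, `source_ref`) standing in for whatever your target system uses:

```python
import uuid

# Records from two sources, joined on serial number (the primary key).
discovery = {"SN123": {"serial": "SN123", "hostname": "LAP-001"}}
spreadsheet_rows = [{"serial": "SN123", "cost_centre": "CC-42"}]

merged = {}
for row in spreadsheet_rows:
    # Secondary unique key: stored in a spare field in the target system
    # so every record can be traced back to its source row, and extra
    # columns can be added later without reimporting the whole data set.
    row["source_ref"] = str(uuid.uuid4())
    serial = row["serial"]
    merged[serial] = {**discovery.get(serial, {}), **row}

print(merged["SN123"])
```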
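And for the KEEP IT SIMPLE tip, the kind of "SQL extract, then import" integration the article has in mind can be sketched with Python's standard-library sqlite3. The in-memory databases and table names are placeholders; in practice these would be connections to your discovery tool and your ITAM repository.

```python
import sqlite3

# Hypothetical source and target databases for the sketch.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")

# Seed the source so the example runs end to end.
src.execute("CREATE TABLE discovered_assets (serial TEXT, hostname TEXT)")
src.execute("INSERT INTO discovered_assets VALUES ('SN123', 'LAP-001')")

dst.execute("CREATE TABLE assets (serial TEXT PRIMARY KEY, hostname TEXT)")

# Extract from the first system...
rows = src.execute("SELECT serial, hostname FROM discovered_assets").fetchall()

# ...then import into the second, replacing stale copies of the same asset.
dst.executemany("INSERT OR REPLACE INTO assets VALUES (?, ?)", rows)
dst.commit()

print(dst.execute("SELECT * FROM assets").fetchall())

src.close()
dst.close()
```

Two dozen lines, no middleware, and anyone who inherits it can read it in one sitting.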
About Martin Thompson
Martin is the owner and founder of The ITAM Review, an online resource for ITAM professionals worldwide. The ITAM Review is best known for its weekly newsletter of the latest industry updates, its LISA training platform, the Excellence Awards and conferences in the UK, USA and Australia.
Martin is also the founder of the ITAM Forum, a not-for-profit trade body for the ITAM industry created to raise the profile of the profession and bring an organisational certification to market. On a voluntary basis, Martin contributes to ISO WG21, which develops the ITAM international standard, ISO/IEC 19770.
He is also the author of "Practical ITAM - The essential guide for IT Asset Managers", which describes how to get started and make a difference in the field of IT Asset Management. In addition, Martin developed the PITAM training course and certification.
Prior to founding the ITAM Review in 2008, Martin worked for Centennial Software (Ivanti), Silicon Graphics, CA Technologies and Computer 2000 (Tech Data).
When not working, Martin likes to ski, hike, motorbike and spend time with his young family.
Connect with Martin on LinkedIn.