How many machines, really?

Lack of visibility is the biggest challenge

The most common challenge we face when starting a new software / IT asset management project is obtaining reliable data on exactly how many assets we’re dealing with.

Lack of visibility is the biggest challenge. In SAM/ITAM “Nirvana”, an organisation would be able to tell you how many machines they have purchased; how many of these are currently operational; how many were retired; where they are; in which environment they’re operating; which software they’re running; and whether or not they are virtualized. Ideally they would also tell you how often these machines are used and, if they’re mobile devices, precisely where.
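
In concrete terms, that ideal register boils down to one well-maintained record per machine. The sketch below (Python, purely illustrative; the field names are assumptions rather than any standard schema) shows the sort of attributes such a record would need to capture.

    from dataclasses import dataclass, field
    from datetime import date
    from typing import Optional

    @dataclass
    class AssetRecord:
        """One row in an idealised ITAM register (illustrative fields only)."""
        serial_number: str                    # hardware serial or asset tag
        purchase_date: date                   # when the machine was bought or leased
        status: str                           # e.g. "operational", "retired", "returned to lessor"
        location: str                         # site, office or data centre
        environment: str                      # e.g. "production", "test", "development"
        assigned_user: Optional[str]          # who the asset is mapped to, if anyone
        is_virtualised: bool = False          # physical host or virtual machine
        installed_software: list[str] = field(default_factory=list)
        last_seen: Optional[date] = None      # last inventory scan or usage check-in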

Sadly, we do not live in this “Nirvana” yet. So instead of delivering savings from day one, SAM consultants invariably spend a large portion of their time at the start of a project tidying up the data before they can add any real value.

So what is the cause of this problem, and what can organisations do to ensure their software and hardware inventory data is more accurate?

When inventory is not enough

The foundation for any successful project is good data from the start. For ITAM, this is reliable inventory or discovery data detailing the organisation’s IT assets and their configuration. But conducting an inventory of IT assets is by no means the end of the story – these records need to be matched against HR files, payroll, suppliers (including leasing companies), office services, active projects etc. in order to get the full picture. Whereas a typical ITAM discovery process will tell you what assets the organisation has, it won’t necessarily match these assets to specific users, locations or projects. And while you might have a machine mapped to a user, you need HR records to know if that user still works at the organisation.
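
To make that matching step concrete, the sketch below (Python, purely illustrative) reconciles an inventory export against an HR export to flag machines still assigned to people who have left. The file and column names ('serial_number', 'assigned_user', 'user_id', 'status') are hypothetical assumptions; a real engagement will need its own mapping.

    import csv

    def flag_assets_of_leavers(inventory_csv: str, hr_csv: str) -> list[dict]:
        """Return inventory rows whose assigned user is no longer an active employee.

        Assumes a hypothetical inventory export with 'serial_number' and 'assigned_user'
        columns, and an HR export with 'user_id' and 'status' columns.
        """
        with open(hr_csv, newline="") as f:
            active_users = {
                row["user_id"] for row in csv.DictReader(f) if row["status"] == "active"
            }

        with open(inventory_csv, newline="") as f:
            return [
                row for row in csv.DictReader(f)
                if row["assigned_user"] and row["assigned_user"] not in active_users
            ]

    # Example usage:
    # orphaned = flag_assets_of_leavers("inventory_export.csv", "hr_export.csv")
    # print(f"{len(orphaned)} machines are assigned to people no longer with the organisation")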

Typically these additional, non-IT records are not in the best shape at the start of a project either. It is prudent to set aside the first month of an engagement for data cleansing and reconciliation in order to determine who and what is really on the network, and where.

Now where did I put those 700 laptops?

It is worth highlighting one area of the data audit that is particularly challenging for many organisations – the tracking of leased assets. Without good record keeping of where the assets are and how long their lease terms are, it is very easy for firms to lose track of these assets. This is especially true if they’re mobile devices. A typical scenario could look like this:

The company’s procurement records say that the organisation has 4,000 machines, yet the ITAM consultant can see that the inventory has only picked up 3,300. Where are the missing 700? There are many possible answers: they may already have been handed back to the leasing company, decommissioned, moved to a different project or location, or even stolen! Or worse still, they’re still within the company but the inventory didn’t pick them up.

Fundamentally, no licence optimisation or compliance project can begin without knowing which of these answers is correct. No project can begin on assumptions, and it is the role of the client to present you with the cleanest data possible. If they cannot do this, they should be fully aware of the impact on the project’s ability to deliver. Bad data slows down projects and drives up costs unnecessarily for the customer. While I certainly don’t mind cleaning up the data myself, it is not the most efficient use of a skilled consultant’s time.
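
As a back-of-the-envelope illustration of that reconciliation, assuming both procurement and discovery records carry a comparable identifier (a serial number or asset tag), the gap between the 4,000 on the books and the 3,300 discovered is just a set difference. The function names below are made up for the example.

    def unaccounted_assets(procured: set[str], discovered: set[str]) -> set[str]:
        """Serial numbers that procurement says we have but discovery never saw."""
        return procured - discovered

    def unknown_assets(procured: set[str], discovered: set[str]) -> set[str]:
        """Serial numbers seen on the network that procurement has no record of."""
        return discovered - procured

    # Example: 4,000 machines on the books, 3,300 responding to inventory.
    # missing = unaccounted_assets(procured_serials, discovered_serials)  # the "missing 700"
    # Every serial in 'missing' still needs a disposition: returned to the lessor,
    # decommissioned, moved, stolen, or simply not reachable by the inventory tool.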

Don’t believe everything you read: Implementing tools doesn’t guarantee data quality!

Naturally, it is safe to assume that proactive management and regular maintenance of ITAM-relevant data is the best course of action to avoid unnecessary costs and delays at the start of a project. However, a systems management tool does not in itself guarantee good data. If the tool is ill-configured and its results are not routinely investigated, cracks will soon form under the surface. An audit is not the time to discover that your systems management tool is not living up to the promises made at the time of purchase.
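
One cheap, ongoing sanity check, sketched below on the assumption that each inventory record carries a last-successful-scan timestamp, is to measure how much of the estate has not reported recently. A high stale ratio is an early warning that the tool is not covering what you think it covers; the threshold and function name here are illustrative, not a recommendation from any particular vendor.

    from datetime import datetime, timedelta

    def stale_record_ratio(last_scan_times: list[datetime], max_age_days: int = 30) -> float:
        """Fraction of inventory records whose last successful scan is older than max_age_days."""
        if not last_scan_times:
            return 1.0  # no data at all is the worst case
        cutoff = datetime.now() - timedelta(days=max_age_days)
        stale = sum(1 for scanned in last_scan_times if scanned < cutoff)
        return stale / len(last_scan_times)

    # Example: flag the estate for investigation if more than 10% of records are stale.
    # if stale_record_ratio(scan_times) > 0.10:
    #     print("Inventory coverage looks unreliable; check agent health and tool configuration")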

The disconnect: The root cause of poor asset data

Systems management aside, tools themselves are not the root cause of the lack of transparency over active assets at the start of a project. The issue of data quality is more fundamental than that, and can be linked to a disconnect between HR, infrastructure and procurement; not an uncommon scenario in most large organisations. HR is concerned with employee productivity – they ensure that employees have the resources they need to do their job. They are generally not concerned with tracking these resources once they have been deployed.

Infrastructure, on the other hand, responds to the needs of HR and the broader business to ensure the necessary resources are available to the organisation, while procurement takes its orders from HR and infrastructure and focuses on getting the best value from the investments it is asked to make. While each department has its own priorities within the organisation, together they share the same goal when it comes to ITAM – ensuring that the organisation has the resources it needs at the right time and at the best price. If these departments worked together on ITAM they would have a vested interest in managing their assets more closely and, as a result, would compile more accurate (and more regularly updated) data on active assets.

Working together in the pursuit of ITAM “Nirvana”

Too many organisations only look at their inventory and related data when an event forces them to, such as an audit, a merger/acquisition or a licence renewal/true-up. This is not the best time to discover that you’re missing 700 machines. By working together towards the common ITAM goal, HR, infrastructure and procurement will be motivated to keep their organisation’s data up to date and accurate. This will help them to save money on their IT and software assets in the long term, better prepare them in the event they are ever audited or need to return leased assets early, and ensure they get the maximum value from the time their ITAM consultants spend with them.

About Filipa Preston

Founder of Software Optimisation Services, a global IT and Software Asset Management (SAM) consulting practice.

Filipa is currently the only SAM professional outside of Europe to have been independently assessed and verified by KPMG for her professional and thorough approach to software licence compliance projects. She is also one of only two IAITAM certified CITAM professionals in Australia. In addition she is a Microsoft Certified Professional (MCP 2.0) and IBSMA ISO 19770-1 PCSAM professional.

4 Comments

  1. Piaras says:

    Your points on inventory data are well made, especially when the tools are ill-configured. It is common for AD and Exchange records to be hopelessly out of date in large organizations.

    On the other side, IT and procurement processes can easily get out of sync.

    The best approach I have found is cross-verification between systems: take frequent inventories of the IT estate and reconcile the differences. This is ultimately achieved, as you have concluded, through the various teams working together.

  2. Steve O'Halloran (AssetLabs) says:

    I concur with Piaras; it’s in your best interest to blend!

    We cross-verify by taking the best of agented and non-agented tools: ‘agented’ (such as SCCM) to collect data from devices (e.g. laptops) that might not be currently available… and non-agented (such as MAP) to capture the devices on which an agent was never installed (new devices, production servers).

    Each tool has its misgivings and errors, but a blended approach gives you the best of both worlds… and richer insight.

  3. Cary King says:

    “The issue of data quality is more fundamental than that, and can be linked to a disconnect between HR, infrastructure and procurement; not an uncommon scenario in most large organisations.”

    More fundamental than that, really, is the lack of basic check-in/check-out management control over the assets in which the organization invests.

    “A fool with a tool is still a fool.” – Grady Booch, founder of Rational Software

    At the core of Asset Management and Configuration Management is the basic requirement for management controls and record keeping. Discovery tools should be used to verify those controls and to update records where users make changes without IT.

    ADM Hyman Rickover wrote, “Unless you can point the finger at the man who is responsible when something goes wrong then you never had anyone really responsible.”

    Every task of every change should be planned, organized, delegated, controlled, verified and measured.

    Every installation of software creates a constructive liability. Every copy of a VM platform containing lots of software creates a fairly large constructive liability – perhaps as much as $30k to $50k. Post facto SAM, SAM without prior restraint controls, is a never-ending game of catch-up. Where else in your organization would you let a technician create a corporate liability of $30k to $50k without approvals, controls, etc.?

  4. Michael Racz says:

    As an inventory specialist (yes, we do exist) we deal with millions of asset data records annually, and it is common to discover our clients’ accuracy rates to be below 45%. Additionally, our clients are normally off by more than 20% on the number of estimated IT hardware assets in the environment. Extrapolate these numbers across large-scale IT environments and the task becomes enormous.

    If you are an ITAM practitioner or IT professional, you understand that it is all about the blocking and tackling, i.e. "the fundamentals" of tracking assets. Our most efficient clients are the ones who still do it the old-fashioned way: physical inventory, cycle inventory and audits, all complemented by the latest software discovery or data management strategy.
