

Blog Posts from the team at Budgeting Solutions

Separating the wheat from the chaff can be challenging, and sometimes the right facts are hard to find. Our blogs contain a wealth of information gathered over many years of experience by our consultants dealing with everyday issues in projects on client sites.

They may provide clear insight on particular problems or act as a launching pad to clarify your thoughts on issues with how to deliver robust performance management solutions.

Main Posts

How to reduce the time spent in finance meetings

Posted by James Salmon at 3/12/2018 12:22:15 PM

CFOs know better than anyone that time is money. So why do they waste so much of it in meetings?

More than half of finance teams average nine or more hours each week in meetings, according to a survey of CFOs, and 25% sit through a whopping 13+ hours.

The wrong way to provide financial data

The problem is that because finance teams have access to information and metrics across the organisation, they serve as the gatekeepers to data. When you’re the information gatekeeper, you often become the information sherpa, too. In traditional organisations, this becomes a huge time suck, because finance executives have to guide their colleagues through masses of spreadsheets, addressing the confusion and doubts that proliferate in a disjointed system.

Traditionally, they’ve gathered PowerPoint slides from various departments and shoehorned them into a huge deck to present to the executive team. The slides are invariably a never-before-seen mishmash of information taken from multiple spreadsheets, leaving attendees questioning data and trying to understand formulas. Every number is explored and every assumption challenged, which—let’s face it—rarely makes a big difference in the final budget.

As a result of these marathon meetings, CFOs often feel trapped, spending almost all their time looking in the rear-view mirror, which leaves precious little time to focus on the road ahead. And because they waste time that should be devoted to financial strategy discussion, those meetings also waste a stomach-turning amount of money: Some $37 billion goes down the drain every year during unproductive meetings.

Cloud-based finance tools help solve problems

Transitioning to a cloud-based financial tool can immediately help plug this cash haemorrhage. With easily navigable dashboards, cloud-based finance tools provide a single source of truth that eliminates confusion and uncertainty.

When you use a cloud-based finance tool, everybody sees updated numbers ahead of time and can toggle back and forth between an original financial plan and the current version. As a result, everyone comes to the table more prepared. That eliminates the whole “How did you reach this number?” discussion that gobbles up a tremendous amount of time, and quickly elevates meeting conversation from basic explanation to higher-level financial strategy discussion.

Financial Performance Management tools also make it plain to see what was shared at previous meetings, so historical comparisons are quick and painless. As a result, financial planning meetings can be short but incredibly effective.

Together, this transparency and the creation of a single source of truth make non-finance execs far more efficient in understanding the data. This autonomy, in turn, frees up FP&A professionals to contribute higher-level strategy insight. And that’s critical, because the CFO role has transformed: Today’s finance leaders must provide forward-looking vision in addition to accurate financial forecasts. Most CFOs have seen their level of strategic influence increase as they take on concerns beyond number-crunching, an Accenture survey found.

Active planning with cloud-based finance software, then, presents a win for everyone: It’s fast, it’s easy, it’s powerful, and it accelerates decision-making by providing clear, accurate information. That lets CFOs step into their rightful role as strategic advisors—and gets everyone out of that conference room ASAP.

Find out more about how active planning can help you save time and money—and become more strategic in your financial meetings.

Transitioning Cognos BI V10 Skills to Cognos Analytics V11

Posted by James Salmon at 18/12/2017 12:03:10 PM

Clients have been calling in with questions about the exciting next generation of the IBM Cognos Business Intelligence platform – known as IBM Cognos Analytics, v11. Besides wanting to know what is new, one of the common questions we hear is: Will my Cognos 10.x skills translate to the equivalent roles in v11?

The short answer is yes. Even though v11 presents an entirely new user experience (UX), we have found that for existing users the transition from v10 can be a positive experience. Below we summarise which skills translate directly and which do not, with detailed notes on our findings and a quick preview of what’s brand new.


  • Query Studio, Analysis Studio, Event Studio, Metric Studio, and Cognos Workspace are not a part of the v11 UX. However, each of these capabilities and objects created from them will migrate 1-to-1 into v11 and will run in their legacy UI in a separate browser tab or window. Objects created in these tools in v11 will also be saved as legacy objects – currently there is no migration path. IBM has announced end of life for these tools after this release. 

  • If you intend to continue using any of these studios after upgrading to v11, any current v10.2 training will be 100% relevant. Budgeting Solutions recommends devising a strategy for migrating away from these solutions in the long term. Contact us for more information.



  • Report Studio and Cognos Workspace Advanced do not exist as-is in the v11 UI/UX; however, all of their features and functionality are available in v11’s new authoring interface. Experienced authors will not find migrating from v10.2 to v11 a major challenge. Core techniques learned in a v10.2 Report Studio or Cognos Workspace Advanced environment will translate directly to v11 – the same goes for Dimensional Authoring, OLAP Data Exploration, and Active Report authoring.
  • Budgeting Solutions will be offering a one-day New Features for Experienced Authors course which will enable a seamless transition for Report Studio and Cognos Workspace Advanced authors. For a limited time we will be offering a complimentary seat in this new course for each student that registers for our existing Report Studio training course. 



The main administration user interface has NOT changed from v10 to v11, nor have the core configuration components. Architecturally this new version is basically the same as v10; however, there are some new installation and configuration options.


  • The most obvious change is to the user experience. Consuming static reports does not change, while consuming interactive reports has become much more seamless in the new UX: every report can now be enhanced by users, much as in Cognos Workspace.
  • Graduated capabilities mean that users have a single UI for consumption, enhancement, customisation, self-service authoring, and data discovery. No more hopping from studio to studio.


So, what about all the NEW stuff?
There are many brand-new features in v11, such as data modules for self-service modelling and a new dashboarding capability. We are in the process of developing training materials for these new features and will be holding a series of free webinars covering many of them.

BOTTOM LINE:  With v11, users will be getting a two-fer: powerful new features along with a thoughtful transition between the legacy and new versions. Core skills developed in v10 will be either directly or easily applicable in v11. Users should continue to upgrade their skills to get the full value of their existing investment and can be confident that their skills will still be valid after upgrade.



Does Your Business Have The Data Agility That It Needs?

Posted by James Salmon at 31/7/2017 2:00:35 PM

Data is the new battleground. For companies, the situation is clear – their future depends on how quickly and efficiently they can turn data into accurate insights. This challenge has put immense pressure on CIOs to not only manage ever-growing data volumes, sources, and types, but to also support more and more data users as well as new and increasingly complex use cases.

Fortunately, CIOs can look for support in their plight from unprecedented levels of technological innovation. New cloud platforms, new distributed data platforms like Apache Hadoop, and real-time data processing are just some of the modern data capabilities at their disposal. However, innovation is occurring so quickly and changes are so profound that it is impossible for most companies to keep pace, let alone leverage those factors for a competitive advantage.


It’s clear that data infrastructures today can’t be static if they are to keep pace with the data requirements of the business. Today’s competitive environment requires adaptive and scalable infrastructures able to solve today’s challenges and address tomorrow’s needs. After all, the speed with which you process and analyse data may be the difference between winning and losing the next customer. This matters far more than it did 10 or 15 years ago: companies used to make a strategic database choice once and run it for a decade or two, whereas now they update their data platform choices far more frequently to keep up.

If companies are to thrive in a data-driven economy, they can’t afford to be handcuffed to ‘old’ technologies; they need the flexibility and agility to move at a moment’s notice to the latest market innovations. However, it’s not enough for companies to simply be technology agnostic; they also need to be in a position to re-use data projects, transformations, and routines as they move between platforms and technologies.

How can your company meet the agility imperative? To start, let’s consider the cloud question.

Many clouds and constituencies

In a data-driven enterprise, the needs of everyone – from developers and data analysts to non-technical business users – must be considered when selecting IaaS solutions. For example, application developers who use tools such as Microsoft Visual Studio and .NET will likely have a preference for the integration efficiencies of Microsoft Azure.

Data scientists may want to leverage the Google Cloud Platform for the advanced machine learning capability it supports, while other team members may have a preference for the breadth of the AWS offering.  In a decentralised world where it’s easy to spin up solutions in the cloud, different groups will often make independent decisions that make sense for them. The IT team is then saddled with the task of managing problems in the multi-cloud world they inherited – problems that often grow larger than the initial teams expected.

One way to meet a variety of stakeholders’ needs and embrace the latest technology is to plan a multi-cloud environment by design, creating a modern data architecture that is capable of serving the broadest possible range of users. This approach can safeguard you from vendor lock-in, and far more importantly, ensure you won’t get locked out of leveraging the unique strengths and future innovations of each cloud provider as they continue to evolve at a breakneck pace in the years to come.

Integration approaches for data agility

Once perhaps considered a tactical tool, today the right integration solution is an essential and strategic component of modern data architecture, helping to streamline and maximise data use throughout the business.

Your data integration software choice should not only support data processing “anywhere” (on multi-cloud, on-premise, and hybrid deployments) but also enable you to embrace the latest technology innovations, and the growing range of data use cases and users you need to serve.

Hand coding

I said “data integration software” as I simply don’t believe that modern data architecture can be supported by hand-coded integration alone. While custom code may make sense for targeted, simple projects that don’t require a lot of maintenance, it’s not sustainable for an entire modern data architecture strategy.

Hand coding is simply too time-consuming and expensive, requiring highly paid specialists and carrying high ongoing maintenance costs. Moreover, hand-coded projects are tied to the specific platform they were written for, and often to a particular version of that platform, which locks the solution to that vendor and technology snapshot. In a continually accelerating technology environment, that’s a disastrous strategic choice. Hand coding also requires developers to make every change, which limits the organisation’s ability to serve the varied and evolving needs of a widely distributed group of data consumers. And finally, it can’t leverage metadata to address security, compliance, and re-use.
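To make that lock-in concrete, here is a minimal, purely illustrative Python sketch (every system, table, and field name is hypothetical) of the hand-coded style described above: extract, transform, and load logic welded to one platform's SQL dialect and schema, so every business-rule or platform change means a code change.

```python
# Hypothetical hand-coded integration: brittle by construction,
# because rules and dialect are baked into the code itself.

def extract_orders(rows):
    """Parse raw CSV-style rows exported from a (hypothetical) legacy system."""
    return [
        {"order_id": int(r[0]), "amount_pence": int(r[1]), "region": r[2]}
        for r in (line.split(",") for line in rows)
    ]

def transform(orders):
    """Convert pence to pounds and keep only UK regions -- business rules
    buried in code rather than held as reusable metadata."""
    return [
        {**o, "amount_gbp": o["amount_pence"] / 100}
        for o in orders
        if o["region"] in {"UK-N", "UK-S"}
    ]

def load_statements(orders):
    """Emit INSERT statements in one vendor's dialect: porting to a new
    platform means rewriting this function (and often the ones above)."""
    return [
        f"INSERT INTO dw.orders (id, amount) VALUES ({o['order_id']}, {o['amount_gbp']})"
        for o in orders
    ]

raw = ["101,2500,UK-N", "102,900,FR", "103,1200,UK-S"]
sql = load_statements(transform(extract_orders(raw)))
```

Everything in this pipeline — the file layout, the currency rule, the region filter, the target dialect — lives only in the code, which is exactly why such projects resist re-use and require a developer for every change.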

Traditional ETL tools

Traditional ETL tools are an improvement over hand coding, giving you the ability to be platform agnostic, use lower-skilled resources, and reduce maintenance costs. However, their major drawback is that they require proprietary runtime engines, which limit users to the performance, scale, and feature set those engines were initially designed to address.

Almost invariably, they can’t process real-time streaming data, and they can’t leverage the full native processing power and scale of next-generation data platforms, which have enormous amounts of industry-wide investment continually improving their capabilities. After all, it’s not simply about having the flexibility to connect to a range of platforms and technologies – the key is to leverage the best each has to offer. Moreover, proprietary run-time technologies typically require software to be deployed on every node, which dramatically increases deployment and ongoing management complexity.

Importantly, this proprietary software requirement also makes it impossible to take advantage of the cloud’s ability to spin capacity up and down, which is critical to realising its elasticity, agility, and cost-savings benefits. Traditional ETL tools simply can’t keep up with the pace of business or market innovation and therefore prevent, rather than enable, digital business success.

Agile data fabric

What’s required for the digital era is scalable integration software built for modern data environments, users, styles, and workflow – from batch and bulk to IoT data streams and real-time capabilities – in other words, an agile data fabric.

The software should be able to integrate data from the cloud and execute both in the cloud and on-premises.  To serve the increasing business need for greater data agility and adaptability, integration software should be optimised to work natively on all platforms and offer a unified and cohesive set of integration capabilities (i.e. data and application integration, metadata management, governance and data quality).


This will allow organisations to remain platform agnostic, yet be in a position to take full advantage of each platform’s native capabilities (cloud or otherwise) and data technology. All the work executed for one technology should be easily transferable to the next, providing the organisation with economies of skills and scale.

The other critical capability to look for in an agile data fabric is self-service data management. Moving from a top-down, centrally controlled data management model to a fully distributed one is the only way to accelerate and scale trustworthy insight organisation-wide. If data is to inform decisions across your entire organisation, then IT, data analysts, and line-of-business users all have to be active, tightly coordinated participants in data integration, preparation, analytics, and stewardship. Of course, the move to self-service can descend into chaos without appropriate controls, so these capabilities need to be tightly coupled with data governance functions that empower decision makers without putting data at risk or undermining compliance.

The challenge CIOs face today is acute – with rapidly advancing platforms and technology, and more sources to connect and users to support than ever before. Meeting these new and ever-evolving data demands requires that companies create a data infrastructure that is agile enough to keep pace with the market and the needs of the organisation. 
