Navigating Future Prospects: IBM’s TM1 Amidst a Cloud BI Tool Surge

As the business intelligence (BI) horizon continually expands, IBM’s TM1 (now known as IBM Planning Analytics) finds itself amidst a proliferating cluster of cloud-based BI tools. This powerful, multidimensional analysis tool has been vital for organizations seeking robust planning, forecasting, and reporting solutions. With the advent of versatile Cloud BI tools, the landscape appears saturated, but TM1 still holds a unique, invaluable position for businesses.

IBM TM1’s Strengths

IBM TM1’s longstanding reputation is built on its high-speed analytics, reliable data consolidation, and adept planning capabilities. It’s a tool designed for scalability and performance. Its in-memory OLAP (Online Analytical Processing) server ensures swift computations and analyses, allowing businesses to access and disseminate complex data efficiently.

TM1’s user-friendly interface, coupled with Excel integration, offers a familiar environment for users. This seamless integration reduces the learning curve, enabling businesses to leverage the tool’s extensive capabilities without extensive training. TM1’s rule-based calculations further simplify financial planning and forecasting, providing accuracy and consistency in projections.

The Cloud BI Explosion

Cloud BI tools have indeed taken the corporate world by storm, providing unparalleled accessibility, scalability, and collaboration. These tools offer subscription-based models that are attractive to small and medium-sized enterprises due to their cost-effectiveness and ease of implementation. With the cloud’s elasticity, businesses can scale their BI operations smoothly, aligning with fluctuating needs and market demands.

TM1 in the Cloud Era

Despite the cloud surge, IBM TM1 remains relevant and continues to evolve to meet contemporary business needs. Its transition to IBM Planning Analytics signifies an embrace of modern features and cloud functionality. The tool now offers a cloud-based version, ensuring users enjoy the benefits associated with cloud computing, such as remote accessibility and reduced reliance on physical hardware.

Planning Analytics incorporates AI-infused planning and forecasting capabilities, providing deeper insights and more accurate projections. Its expanded set of APIs for data integration and connectivity ensures it can work harmoniously with various data sources and third-party applications, making it versatile and adaptable in the crowded BI ecosystem.

Conclusion

While numerous Cloud BI tools are making waves, IBM’s TM1 (Planning Analytics) confidently strides forward, reinforcing its relevance and utility in the ever-evolving BI market. Its blend of traditional strengths and newfound cloud and AI capabilities makes it a formidable contender, providing a reliable, comprehensive solution for businesses requiring both depth and flexibility in their planning and analytics.

Businesses invested in TM1 need not see the cloud BI proliferation as a sign to abandon ship. Instead, they should explore how TM1’s evolution aligns with their growth and needs, understanding that this seasoned tool is not in the dusk of obsolescence but in a dawn of revitalized potential and opportunity in the future of business intelligence.

Unleashing Potential: Azure Synapse at the Forefront

Azure Synapse, Microsoft’s integrated analytics service, empowers businesses with unmatched speed, performance, and security. For UK consultancy firms intent on facilitating data integration, exploration, and analysis for their clients, this robust platform offers invaluable tools, thereby simplifying the process and driving insightful business intelligence.

At its core, Azure Synapse seamlessly combines big data and data warehousing, offering a unified experience that significantly enhances productivity. The platform allows consultancies to access, analyse, and visualise data efficiently, thereby unlocking valuable insights. With its deep integration of Apache Spark and SQL analytics, Azure Synapse meets the demands of various data workloads without compromise on performance.

Consultancies are often at the forefront of assisting organisations in making data-driven decisions. To this effect, Azure Synapse’s limitless analytics service plays a pivotal role. It allows firms to query data on their terms, at scale. With on-demand or provisioned resources, consultancies can manage and analyse large amounts of data effortlessly, providing their clients with essential, timely business insights.
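To make that concrete, here is a minimal sketch of the kind of on-demand query a Synapse serverless SQL pool can run directly over files sitting in a data lake; the storage account, container and file path below are hypothetical placeholders, not a real client endpoint.

-- Query Parquet files in place using the serverless (on-demand) SQL pool.
-- The storage URL is a made-up example path.
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://examplestorage.dfs.core.windows.net/datalake/sales/*.parquet',
    FORMAT = 'PARQUET'
) AS sales;

Because nothing has to be loaded into a dedicated warehouse first, a consultancy can explore a client’s raw data straight away and pay only for the queries it actually runs.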

Security and privacy are paramount in the consultancy sector. Azure Synapse protects data with advanced security measures, and the platform is compliant with over 50 industry certifications, providing peace of mind to consultancy firms and their clients alike. Built-in features such as automated threat detection and always-on data encryption help ensure a secure environment for sensitive business data.

For consultancy firms in the UK looking to offer their clients a scalable, secure, and powerful analytics service, Azure Synapse stands out as an exceptional choice. It not only accelerates the time to insight but also fosters an environment where data professionals can collaborate and innovate, pushing the boundaries of what’s possible with data.

In conclusion, for UK consultancy firms, Azure Synapse is more than a data analytics tool; it’s a comprehensive service that supports businesses in navigating through their data journey with ease and security. Its exceptional capabilities in data integration, analytics, and security make it an indispensable asset for consultancies aiming to lead in the competitive, data-driven business landscape.

AWS vs. Azure: Navigating the BI Horizon for SMEs

The business intelligence (BI) landscape for Small and Medium Enterprises (SMEs) is awash with robust solutions, chief among them being Amazon Web Services (AWS) and Microsoft Azure. These giants offer compelling features tailored to bolster BI processes, though they differ subtly in terms of benefits and offerings.

AWS: Unparalleled Scalability & Flexibility

AWS excels in scalability, a critical factor for growing SMEs. As your BI needs expand, AWS’s extensive suite of cloud services can effortlessly scale to match your requirements, providing flexibility without added complexity. This elasticity ensures that SMEs only pay for the resources they use, fostering cost-efficiency while accommodating fluctuating workloads.

AWS also boasts a diverse array of BI tools and services, such as Amazon QuickSight and AWS Glue. QuickSight is a snappy, cloud-powered BI service designed for ease of use, allowing SMEs to create and publish interactive dashboards accessible from various devices. These tools are crafted to integrate seamlessly with each other and with third-party applications, offering a holistic, customizable BI environment.

Azure: Seamless Integration & Security

For SMEs heavily invested in Microsoft products, Azure provides a seamless and intuitive experience. Azure’s BI services, including Azure Synapse Analytics and Power BI, offer deep integration with familiar tools like Excel and a host of other Microsoft applications. This integration fosters a streamlined workflow, allowing businesses to harness the full potential of their existing infrastructure and software investments.

Azure doesn’t just offer integration; it’s steadfast about security. With over 50 compliance certifications, it goes the extra mile to secure your data. SMEs can trust Azure’s robust security framework to protect their sensitive BI data, providing peace of mind amidst the increasing threats in the digital space.

Making The Choice

For SMEs on the BI journey, the decision boils down to specific business needs and existing infrastructure. If your enterprise demands unparalleled scalability and desires a broad set of tools, AWS is a formidable choice. However, for businesses seeking seamless integration with Microsoft products and robust security measures, Azure stands out.

Final Thoughts

Both AWS and Azure bring distinct advantages to the BI table for SMEs. The choice between the two should align with your business’s unique needs, growth expectations, and the environment that best supports your BI objectives. Navigating the decision requires a careful examination of each platform’s offerings, ensuring your SME harnesses the power of a BI solution that not only meets but anticipates and evolves with your business demands.

SSD bad sectors, chkdsk rubbish, .NET corruption and SFC can’t repair

Being an SSD early(ish) adopter in my work laptop I’ve had my fair share of problems, but as everything seemed to have settled down recently, particularly after upgrading to Windows 8, I thought I’d go back to encrypting the disk. I know this comes with a performance hit, but using BitLocker in Windows 8 it wasn’t too noticeable, and since it could encrypt both of the disks in the laptop with a single password at boot, it was finally becoming usable. I mention the encryption because I suspect it is responsible for the amount of corruption caused by a sudden blue screen, or should I say far-from-smiley face, in Windows 8.

Since Windows 7 this laptop has been rock solid. I don’t change drivers frequently but do stay reasonably up to date, I apply Windows updates but try to do them on a clean boot and check things have bedded down nicely, and I keep on top of what’s running with fairly brutal use of msconfig. That has served me well for ages, but on this particular morning, mid key press in Word or something similar, it just went, frowned at me and rebooted. After that, it was just an endless series of “unmountable boot volume” messages.

Here some frustration with Windows 8 kicks in: I didn’t have a boot disk as I was at work, yet the option to repair has completely gone, even from the screen that offers to boot into safe mode. Apparently you need to have booted Windows to enable the repair option on boot! Of all the stupid things to do. Come on MS, have a key press like F8 that can boot from a tiny repair image or similar; I don’t need a menu or a significant delay, because I’m happy to hammer an F key to get the timing right if I’m in trouble. It’s not even on the “sorry, it looks like your computer is knackered, try safe mode” menu, which would have made sense.

Anyway, once I’d built a boot disk and got into repair mode, the first check was a chkdsk (without repairs) on the SSD. This has happened to me before: although SSDs don’t have sectors in the traditional sense, no one seems to have told the OS that, and it reported multiple bad ones. In fact it was one of those chkdsk runs that suggests a very long wait for it to do not very much. Previous experience trying to repair the disk using chkdsk had led to me almost throwing the SSD away; it will not do the job, full stop, in my experience, as it just can’t get its head around it not being a spinning platter.

What I found last time was that the only way is to clone the disk to another device, ignoring errors (for which I use HDClone Pro, which is quick and always does the job), then clone it back. I know this is expensive in terms of write cycles, but it seems to be the only way to get the SSD firmware to sort out its bad blocks (which aren’t even blocks, sectors or bad, in fact). With that done, I then ran chkdsk with repair, as I had a copy anyway, and it found a load of errors, patched them up and booted Windows. Hurray.

Booting into Windows and running SFC /SCANNOW to see what damage had been done, it turned out .NET 2 was broken: a number of DLLs corrupted, and the store for those DLLs also corrupted. Now I’ll digress for one moment more. Why the hell are DLLs getting corrupted, MS? They should not be open for write, surely, so how can they get corrupted? And when they do, how is it that the stored copy in WinSxS is also corrupted? I realise my understanding of the innards of Windows is far from complete (very far, I’d imagine), but that just seems crazy to me.

I’ve been here before too, fixing someone else’s laptop that had .NET corruption and a corrupt store. No amount of scanning and interrogating the CBS.log got that one fixed; it was Windows Vista and it never came back. Imagine my surprise after a while on a popular search engine!

This Microsoft article says it all: Windows 8 has two simple command line actions, one to diagnose and one to repair, that can go online and patch things up. Personally I think SFC should have done this, and better still, it should have a user-friendly GUI, because if I were not a geek I might have thrown this laptop out of the window before finding the solution.

So here are the commands:

DISM.exe /Online /Cleanup-image /Scanhealth

followed by

DISM.exe /Online /Cleanup-image /Restorehealth

All is now well until the next failure!

PS I’ve taken encryption off. I know it’s naughty and I could get sued, but it just doesn’t work well with SSDs.

Stored Procedure execution plan different to Query – Parameter sniffing

I’ve been an on-again-off-again DBA for over 15 years now, but I still get caught out by stupid little SQL Server behaviours! Today I was looking into why a perfectly good SQL statement that, in testing, hits an index perfectly and runs in milliseconds takes many seconds in production when run as a stored procedure. The execution plans are completely different between the stored procedure and the raw SQL, and it turns out this is just a failing of parameter sniffing. Parameter sniffing is perfectly normal and necessary: SQL Server has to decide on an execution plan, and it tends to use the first parameter values submitted to the stored procedure as the basis for that plan. It then caches the plan and uses the same access method each time the proc runs. That’s great if all parameter values return similar results and it gets the plan right first time. Unfortunately, in my case it had got it all wrong and wouldn’t give up!
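Before reworking the proc, a quick sanity check, sketched below with a placeholder procedure name, is to mark it for recompilation so the next execution sniffs a fresh set of parameters; if performance temporarily comes good, parameter sniffing is almost certainly the culprit.

-- dbo.sp_dummystoredproc is a placeholder name; this just invalidates the cached plan
-- so the next call compiles a new one.
exec sp_recompile N'dbo.sp_dummystoredproc'
-- DBCC FREEPROCCACHE would clear every cached plan, but that is a blunt instrument on a production box.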

To trick the procedure into emulating the plan used for the simple query in SSMS, I ended up just copying the parameter into a local variable. Sounds stupid, but it works a treat.

Instead of

create procedure sp_dummystoredproc @param1 int
as
-- SomeTable and col1 are placeholder names
select * from dbo.SomeTable where col1 = @param1

use

create procedure sp_dummystoredproc @param1 int
as
-- copy the parameter into a local variable and filter on that instead;
-- the optimiser can't sniff a local variable, so it plans from average statistics
declare @dummyparam1 int = @param1
select * from dbo.SomeTable where col1 = @dummyparam1

It’s not a particularly costly fudge, so it’ll do: it solves the problem and makes the query use the correct indexes.
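For completeness, two other standard mitigations are worth knowing, sketched below against the same placeholder table with made-up procedure names: OPTION (RECOMPILE) trades a compile on every execution for a plan tailored to the actual parameter, while OPTIMIZE FOR UNKNOWN asks the optimiser to plan from average statistics rather than the sniffed value, much like the local variable trick above.

-- Option A: recompile on every execution, so the plan always fits the parameter actually passed in
create procedure sp_dummystoredproc_recompile @param1 int
as
select * from dbo.SomeTable where col1 = @param1
option (recompile)
go

-- Option B: build the plan from average column statistics, ignoring the sniffed value
create procedure sp_dummystoredproc_unknown @param1 int
as
select * from dbo.SomeTable where col1 = @param1
option (optimize for (@param1 unknown))
go

Which approach is right depends on how often the proc runs and how skewed the data is; for a proc called thousands of times a second, the cost of recompiling every call can outweigh the bad plan.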

SSIS FTP Task Error: Failed to lock variable

Just encountered a frustrating error that prevented a new SSIS package from compiling and running. FTP Tasks had stubborn red crosses against them, with the cryptic errors below.

Error    1    Validation error. Delete C1 CTRL : Failed to lock variable “” for read access with error 0xC0010001 “The variable cannot be found. This occurs when an attempt is made to retrieve a variable from the Variables collection on a container during execution of the package, and the variable is not there. The variable name may have changed or the variable is not being created.”.      Blah.dtsx    0    0   
Error    2    Validation error. Delete C1 CTRL : The Validate method on the task failed, and returned error code 0x80131500 (Failed to lock variable “” for read access with error 0xC0010001 “The variable cannot be found. This occurs when an attempt is made to retrieve a variable from the Variables collection on a container during execution of the package, and the variable is not there. The variable name may have changed or the variable is not being created.”.  ). The Validate method must succeed and indicate the result using an “out” parameter.      Blah.dtsx    0    0   
Error    3    Validation error. Delete C1 CTRL : There were errors during task validation.     Blah.dtsx    0    0

What variable? “” doesn’t even exist, so why are you trying to lock it!

As always, the answer was simple and it was my fault again: I had changed the FTP Task to use variables for the source and destination paths, then changed the operation type to delete remote file. This left a hidden field that wanted a local path variable name, which wasn’t used, but I’d told the task to use variables!

Changing the task back to send files, setting the local path variable setting to false and then changing it back to delete remote file sorted it, and all was well again.

So beware hidden variable fields. I’m guessing this will probably apply to most SSIS tasks that access paths, such as the File System Task etc.

Andy

ASP.NET MVC2 & Areas : Error 404 Resource not found

I’ve started (probably later than I should have) messing around with ASP.NET MVC 2 for a client project. I’d been writing everything very traditionally as an ASP.NET application, and it was getting more and more complicated and harder to improve, as it’s not my main job but more of a bolt-on to what I actually do day to day, which is BI and data architecture.

As always, I read through a couple of tutorials and then decided to have a go. I’d played with Symfony using PHP a couple of years ago, so I felt I understood the MVC concept well enough to see whether it was a viable alternative to the messy code I was currently creating. As is so often the case, I didn’t read the part of the tutorial that covered Areas and started developing. After a day I began to think about the application structure, realised it had four quite distinct functions, and stumbled upon Areas.

Again being far too quick off the mark, I created an Area and moved some of my controllers into it, and funnily enough they stopped working, displaying a 404 Resource Not Found error.

To cut a long story short, after 2 hours on Google, I discovered thanks to this article that I just hadn’t been careful with my namespaces.

So in the moved controller class files the namespace was still ApplicationName, when it should have been ApplicationName.Areas.AreaName (obviously replacing the application and area names with your own). Recompile and all is well.