Monday 30 July 2018

Example of Engagement and Deferred Ownership in BI


This comes out a day late, so my apologies to everyone. As promised, today I will be discussing how I put the AGILE strategy and the Deferred Ownership model to work in a real-life scenario.



Due to the confidentiality of the data I work with, this example may be a bit vague on the details, but I hope that you will forgive me this as we go through it.



The background:

         

I began work at my organization in July of 2017. The Business Intelligence position was net-new for the organization, and the department had some ideas of what they wanted but no real direction on how to achieve it. We narrowed the scope down to a single piece of automated reporting for one of the largest departments, to be developed as a proof of concept.

Given that I was allowed autonomy in developing the report, the data model and the infrastructure of the BI program, I had the leeway to try out this model of mine.



I reached out to the manager of the business user group and set up a one-to-one meeting to discuss their needs and the challenges they had faced before my arrival. I came in with an idea of what they would want, based on research into the operational reporting needs of my industry (a must), and from the conversation I determined the core elements they needed, the extended 'nice-to-have' elements they wanted to report on, and the value-added elements they hadn't necessarily thought of reporting on before.

I took this away from the meeting and immediately pulled out my BI tool; in this case Power BI, because Power BI is the best. (Seriously, the best.) I created a design for the report they wanted in both an interactive online version and a print layout version, loading up fake values and selecting the visualizations that best displayed the information they wanted to convey: gauges, bar graphs, grouped columns, and a couple of card values. This took maybe two hours. I didn't bother with font selection or colour theming yet.
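
If you want to whip up placeholder values for a mock-up like this, here is a minimal sketch in Python; pandas and numpy are assumed, and the column names, weekly grain, and value ranges are purely illustrative rather than anything from the real report.

    import numpy as np
    import pandas as pd

    # Reproducible fake values to drive a report mock-up; names and ranges are made up.
    rng = np.random.default_rng(42)
    weeks = pd.date_range("2017-09-04", periods=52, freq="W-MON")

    fake = pd.DataFrame({
        "week": weeks,
        "department": rng.choice(["Admissions", "Registrar", "Finance"], size=52),
        "headcount": rng.integers(150, 600, size=52),
        "target": 450,
    })

    # Dump to CSV so the BI tool can pick it up as a throwaway data source.
    fake.to_csv("mockup_values.csv", index=False)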

I set up a meeting the next day with the manager and team leads and brought the mock-up for a show-and-elicit-feedback session. By doing so, I brought the business users past the planning phase and directly into the design phase! I was able to get immediate feedback on fonts, layouts, colours, visual selections, whitespace, and everything else about what their product would be. They ended up wanting a drill-through pie chart (ugh!), but it gave them a baseline to design against: something they could see, feel, and interact with.

Moving on to the build was mostly about pulling the data, cleaning it up, and building the model for reporting: the data mini-mart (future blog post) that provided the back-end functionality. One other member of the team and I spent about a week building up that data model, based on all the information gathered from the first two meetings and on assumptions drawn from the data profiles completed during the ETL process.
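
For anyone wondering what I mean by a data profile, it is just a quick per-column summary of a staged extract: types, row counts, nulls, distinct values. A minimal sketch in Python follows; pandas is assumed, and the file path is a made-up example.

    import pandas as pd

    def profile(df: pd.DataFrame) -> pd.DataFrame:
        """Summarize each column: type, row count, nulls, distinct values, and a sample value."""
        summary = []
        for col in df.columns:
            series = df[col]
            summary.append({
                "column": col,
                "dtype": str(series.dtype),
                "rows": len(series),
                "nulls": int(series.isna().sum()),
                "distinct": int(series.nunique(dropna=True)),
                "sample": series.dropna().iloc[0] if series.notna().any() else None,
            })
        return pd.DataFrame(summary)

    # Example: profile a staged extract before deciding how it fits into the mini-mart.
    staged = pd.read_csv("staging/enrolment_extract.csv")  # illustrative path only
    print(profile(staged))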

A week later, we launched to a limited user group consisting of the manager and the primary reporting clerk for the department. We gave it to them with ZERO instruction or training, along with the caveat that they were to use it, play with it, and do everything in their power to break it. With that release, we asked them to set up an ongoing weekly, twenty-minute touchpoint to discuss their feedback.

We met every week. The first week brought a data validation concern with one element, so we immediately removed it from the report and began investigating the data model. The second week they found the navigation of the drill-down a bit confusing, so we adjusted the layout to highlight it, and they were happy with the result. The next week, we returned the newly validated data to the report and added some whitespace. The week after that, we added a click-through definitions page and changed some title fonts.

At this point, they were happy. They had something that was theirs, that they had designed, that they had helped build and validate, and that they believed in. I will say that again: they truly believed in this product.

The final steps for us in the BI group were to have a new AD group set up, hand its administration to the business manager we were working with, and migrate everything to production. Less than a day of work.



Over the course of the last year, the business manager has gone on to champion the report to the executive, including interactive demonstrations, and has expanded the user group to the entire management staff of the organization, cementing the report and its attached definitions as the source of truth, all without any additional intervention or prodding from me or my team. I found out about the executive demo only after a piece of positive feedback from a member of the executive team landed in my inbox.



I built it, I helped design it, and I launched the report, but I never owned it. Because I never owned it, I didn't take any changes personally and I didn't have to fight anyone over my vision; I just had to provide support and hold some hands along the way. Exactly what I believe a solid BI program should be about.



Cheers!



SQL Doch






Sunday 22 July 2018

Deferred Ownership for Enhanced Engagement



In the fall of 2017 I had the joy and honour of hearing Bob Tipton speak on engagement at a data science conference. He came in as the odd man out, yet fit perfectly within the group of speakers; nearly every speaker and every attendee was struggling with this concept. How do we bridge the gap between the business users and the technical users in our organizations? I am very much oversimplifying here, but he spoke about setting the bar not at getting "buy-in", but at achieving "ownership" from your business users!



I am not sure that I have a definitive answer, but I can do my best to explain the model I have been using to lead engagement within my organization: deferred ownership!



Regardless of the nature of the initiative we take on with our DataOps model, our ultimate goal with each and every one is to defer ownership of the product to the business user. By doing so, we are able to keep them engaged throughout the development process, and at the end of that process we transfer all ownership and administration directly to the user or client. Once they have ownership, they continue to champion the product throughout the organization without any of my team having to be directly involved. In doing so, they build up organizational awareness of the program I lead and increase the demand for more.



I am going to walk through a major example of how this was accomplished. We began in the early phases of launching business intelligence with a simple goal: to provide the college registrar's office with a dashboard on total headcounts and demographics. We approached the challenge as follows:



1.       Captured business requirements in a single one-hour meeting, discussing the elements to be reported, the source systems, and the basic reporting layout questions.

2.       Designed the data extracts and integrations into our staging area, utilizing a landing zone for raw data and a clean zone for transformed data (see the sketch after this list). Engaged again in a one-hour meeting on the clean data tables with the business user, as well as with the Business Analyst for the area who had previously been providing reporting directly from the transactional system.

3.       Created a prototype of the reporting object and launched it into development. At this point, we handed the prototype off to the business owner, plus a maximum of two other delegates from the business area, to use. We provided zero training, other than asking the users to work with the report and do everything they could think of to break it, mess it up, or otherwise misinterpret any of the report objects. Simultaneously, we scheduled a 15-minute weekly follow-up with the business owner.

4.       We met with the business owner for 15 minutes once per week to discuss their feedback on data element validation as well as layout and intuitive functionality. Each data element with a validation concern was removed from the prototype until it could be validated further, then reinserted, and any areas of ambiguity were addressed with more functional and usable reporting visuals and interactions. Over the course of three weeks, from the initial prototype to launch, we were able to address every element of validation and ambiguity.

5.       Once the user was happy with the data, the layouts, and the functionality, we created a new Active Directory group for the report, transferred administration of that group to the owner, and officially handed the report over to the owner with an instruction sheet on adding and removing people, along with the note that they owned the report now; we would continue to support it, but not claim ownership of it.
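
To make step 2 a little more concrete, the landing-to-clean hand-off just means raw extracts stay untouched in one place and only lightly standardized copies are exposed for modelling. A minimal sketch of that step in Python follows; pandas is assumed, and the folder names, file name, and cleaning rules are illustrative stand-ins rather than our actual pipeline.

    from pathlib import Path

    import pandas as pd

    LANDING = Path("staging/landing")  # raw extracts, left untouched
    CLEAN = Path("staging/clean")      # standardized tables exposed for modelling

    def land_to_clean(filename: str) -> Path:
        """Read a raw extract, apply light standardization, and write it to the clean zone."""
        raw = pd.read_csv(LANDING / filename, dtype=str)

        # Keep the cleaning basic and reversible: normalize column names,
        # trim whitespace in values, and drop exact duplicate rows.
        raw.columns = [c.strip().lower().replace(" ", "_") for c in raw.columns]
        cleaned = raw.apply(lambda col: col.str.strip()).drop_duplicates()

        CLEAN.mkdir(parents=True, exist_ok=True)
        out_path = CLEAN / filename
        cleaned.to_csv(out_path, index=False)
        return out_path

    # Example: promote a (hypothetical) headcount extract so it can be profiled and modelled.
    land_to_clean("registrar_headcount.csv")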



The outcomes of this exercise did, and have continued to, exceed my expectations. By having ownership of the report, and having been involved in the design and validation of every element, the business user moved beyond buying into the product; instead, they have taken it and championed it as the source of truth for the organization.



In the months since the original launch, the user group has expanded from a single small department to the organizational level, through the championing efforts of the business user and multiple demonstrations that my team and I did not have to be involved in (saving us hours of time and effort). In turn, demand has grown for our program to expand and receive additional resources to accommodate the organization.



This type of model flies directly in the face of the standard BI model, where our department would zealously hold onto control of each object and build complex reporting to meet our business users' needs as we perceive them to be. We would then be required to train and socialize each piece within the organization and build up demand through elevator pitches and multiple conversations, both in person and documented.



All of our reporting models have continued in the same vein, and as such, our program has been able to grow internally with minimal effort from our (very) small team.



Our next big step is going to be enacting governance and data mastering through similar means. The greatest success I have seen so far is not that people are willing to work with us on this, but that some areas, upon seeing how their reporting looks, have actually come to us and demanded that we support them in enacting governance and master data management on their behalf.



As always, I am open to questions and feedback, and to hearing how things are working in your own organizations!



Cheers!



SQL Doch

Sunday 15 July 2018

Strategy in AGILE/DevOps


Thus begins the new blog. I am unsure how best to introduce myself or how best to start things off; I suppose it is best to dive right in with what this blog will be about.

The purpose of this blog is to share what I have learned in the areas of database administration, data management, business intelligence, and DevOps/AGILE. I will be discussing my opinions and experience in these matters, and what I am continuing to learn along the way. I have spent my career as a generalist and make no claims to be exceptional in any of these areas, so if something I discuss can be done in a better, more efficient, or easier way, I am very open to feedback.
I currently work as a business intelligence lead for a community college and have been responsible for building business intelligence from the ground up. This has led to some interesting challenges, as well as some interesting opportunities, along the way. The primary opportunity is that I have been given the autonomy to run the program with AGILE/DevOps methodologies.

Which brings me, I think, to the first piece I would like to discuss: strategy design and management in AGILE and DevOps. This stems from a pet peeve of mine; too many times in my career I have heard that people are against agile, or anything based in agile, because every time they have seen it used, it was an excuse not to plan, not to strategize, and not to document properly. I need to make one thing clear: if you are not strategizing your AGILE initiatives, you have set yourself up for failure.

So how do we do it then? Are we able to plan and strategize the same way we do in other methods? Not exactly, but that doesn't mean we don't do it. In a traditional waterfall-style strategy build, we sit down and determine where we are, where we want to go, and the steps required to get there, and we map those steps out with milestones to measure our progress towards completion. This is a tried and true method, and it does work when done correctly. My continued concern with this method, though, is that the strategy is based on the project providing value only upon completion. During the time that the strategy is being followed, the milestones are hit, and we stop to pat ourselves on the back, we are not providing value, only consuming resources towards the promise of value.

When we have changes along the way (a change in leadership, direction, market, user group, or expectations), the waterfall strategy cannot easily account for them. We require a large amount of time and effort to absorb these changes and pivot the strategy towards the new goals, and we still remain unable to provide value until we complete the project. That sort of delayed gratification may work in some industries. When we are providing insight through intelligence, analytics, and data management, however, the key is to provide value sooner rather than later: to provide the tools, the expertise, and the insights to enact appropriate changes, and to guide our users with the best possible data we can.

We strategize this through methods similar to waterfall:

1.       Current State: We always begin a strategy by examining the current state. We need to know what tools we have, what the current state of our data landscape is, where our gaps are, and where we need to improve. Part of this is also about finding out where we currently stand in providing value to our users.
2.       Future State: We need to know where we are going. This is the big-picture outcome that defines the direction of our strategy. In order to strategize properly, we need to understand what the future state will look like, whether that means enacting self-service, completing an enterprise ERD, or launching a centralized data warehouse. This future state is critical for defining the path to success.


This is where we will diverge from standard waterfall strategy mapping. Once we have the future state identified, instead of planning out every step to get there in detail, engaging with all the stakeholders at once, identifying risks and mitigations, and documenting everything ad nauseam, we are going to take a slightly different route.

I.       First, we break down the future state and identify all the parts of it that diverge from the current state. We identify dependencies and determine the importance of each piece of the whole. We identify the areas that can be brought forward as value to the end users and lay these tactical initiatives out as a trail of breadcrumbs for the overall project to follow.
II.      We document this trail, identify the areas of value brought to the organization and the end users, and use these value-adds as the milestones to measure our progress.
III.     We begin the first of the tactical initiatives, launching into either an AGILE cycle or a DataOps/DevOps cycle to achieve success. Once we begin work in earnest, we immediately engage with the clients or stakeholders and get them involved in the process from the ground up.
I will be writing more on this topic in future posts, so I hope you will stay tuned. Next week we will dive deeper into the engagement model for the AGILE and DevOps cycles, as I have been involved in enacting them.

Cheers!

SQL Doch
