Areas of these documents below have been purposely blanked out due to data sensitivity

Recovery KPI Case Study

Ask – Introduction

Many departments in our warehouse track their KPIs, and ours should be no different. The big thing was that we would be going through a huge change after we move warehouses, requiring a major shift in the way we do our work. This meant we also wanted our KPIs to be future-proof.

After some talking, we all realised we should design a dashboard that gives us live updates and shows what is happening with each KPI. Power BI is used a lot throughout Ikea, so the logical route was to use Power BI to create the dashboard. But a dashboard showing what data?

There was a lot of brainstorming, and after some time we came up, as a group, with the KPIs and data that would be useful to have. The questions that came from that are as follows:

  • Repack KPI, are we meeting our targets with the amount we repack?
  • Registered damaged articles into Recovery. Is the warehouse keeping under the target?
  • How often are mistakes made when people are registering damaged articles? How many articles come into Recovery that are not registered? How often are articles registered but not coming to Recovery?
  • How many Quality Salestops have we processed in the past month?
  • How many Quality Investigations have we processed in the past month?
  • How many Quality Actions have we processed in the past month?
  • How much AS-IS do we send per week?
  • How much Scrap do we throw out per week?
  • How many mistakes are made when registering an article as repack when it is not?
  • How many orphans have been found in the past week?
  • How many parts from Spartacus went to Repack in the last week?
  • How many parts were placed into the Spartacus library in the past week?
  • How many Spartacus parts were sent to the stores in the past week?
  • How many Spartacus parts were delivered to the customers in the past week?
  • How many Spartacus parts got damaged in the past week?
  • Are we running low on Spartacus stock?

Prepare

Preparing this data was straightforward in some ways and difficult in others. Astro and Spartacus cannot produce live updates of the data stored in our files. I found this out after speaking to the Spartacus Project Manager in the UK, who is on the team that developed Spartacus; they had tried to interact with the database directly before, but it ended up crashing the application. Astro is a complex system, and it is possible to export the data using an SQL query within the application, then send emails containing the data in Opdoc and use Power Automate to automatically update the data used by Power BI. For now, though, we will update the data manually. The quality Reportlist that tracks sales stops and investigations is an Excel file stored in SharePoint, so it can easily be connected to and kept updated. The same goes for all the articles we register into Recovery: since that uses a SharePoint Lists database, we can easily access it and refresh it with one click in Power BI.

So, all in all, I connected the following data to Power BI:

  • Quality Reportlist Excel Sheet – containing information on all Salestops and Investigations going through our unit
  • Damages 068 Sharepoint List – Containing all the data from an app we use to register the movement of an article through our department
  • Spartacus Stockmovements dataset – downloaded as a report from the Spartacus site; it tells me the stock that has been added, the stock taken away, and where it went. There is no way to pull this data automatically.
  • The Spartacus ComponentReplanishment – I’m able to use this data to understand what is going into Spartacus.
  • The Spartacus Component_Inventory – this contained all the data I needed for more detail on the parts being used, including which articles they came from, with article names and article numbers.
  • hfudhfsufhushfe report from Astro – lists all the times when articles have been registered as damaged
  • sfhdhsghshgjkfsreport from Astro – This lists all the articles that have been repacked
  • Damaged locations from Astro – This lists all the articles that were damaged and the previous locations where they were damaged
  • Trans Types – I needed data on the transaction type het and found I needed to use a new method to pull it. As a happy accident, I found I was able to pull all the data pertaining to damages and repack, so I could get rid of the other Astro datasets and combine everything into this one almighty dataset. I of course had to do the usual cleanup on it; you can find all the details in the changelog.

Process

It has been difficult to ensure the data I am using comes through without errors. You can see in the changelog that I spent a long time eliminating errors that appeared when different regional settings were in use and some values were rejected when I changed them to their correct data types. Eventually, though, I got Power BI to accept the data.

As for cleaning the data, the majority of it is pulled automatically from logistics systems. The exception is the Reportlist, which I personally clean regularly, often physically looking at each row for anomalies that don't accurately represent the physical stock.

Since the majority of this data deals with stock and the data is raw, there has been no reason to check it for bias. We are not attempting to predict the future, only to present everything as it is. When Power BI flags something strange in the data, anyone who works in Recovery can investigate to find out what is going on with that particular article.

Over time, I found that I didn't have the specific columns I needed to build the visuals I wanted in Power BI. So I created new columns with DAX formulas to pull article names out of descriptions, work out week numbers, determine whether something was registered correctly by cross-checking databases, and work out users from the new IDs the database used to protect identities.
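
As a sketch of the user-ID mapping, a calculated column in DAX could look like the following; the anonymised ID values, user names, and column name here are purely hypothetical placeholders, not the real ones:

```dax
// Hypothetical mapping from anonymised SharePoint IDs to Ikea user IDs
User =
SWITCH (
    'Damages 068'[Anon ID],    // assumed column name
    11, "USER-A",
    12, "USER-B",
    "Unknown"
)
```

SWITCH reads more easily than the nested IF statements mentioned in the changelog, while doing the same job.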

Analyse

The whole department can look at the numbers and assess whether something is good or not. After researching our past performance on both the amount of damages and the amount of repack we have done, we established KPIs that represent a goal to achieve each week. I found I could draw these target lines by adding them through the "add further analysis to your visual" button.

In the analysis we discovered a huge issue with decisions being changed after the fate of an article had been decided, as well as with the number of articles being incorrectly registered. We know we need to put more focus on finding out what the issues are. In the future, I could expand these visuals to look specifically into which articles have these issues and how they are being handled.

The quality information is general info that can be used by anyone who is interested in our workload. 

As for the decision section, the information displayed can be used to point someone in the right direction to investigate why a certain decision is so high. For example, if many articles go for scrap, why is that?

The Spartacus data can be used to assess which articles we should focus on getting spare parts for, whether we are running low on parts, and which reasons are the biggest for using the parts.

Share

When I shared this data and presented what is going on, there was interest in taking the project further into forecasting our future workload. After seeing what can be built with DAX, I can confidently say this would be possible, though it could take some time. This will be the next project and an expansion of this one: adding another page to the already existing dashboard.

The dashboard will be permanently shown on a screen for all Recovery co-workers, and management can use it to assess the work coming from Recovery and highlight problem areas that may need looking into.

Act – Conclusion

At the beginning I had many questions that needed answering, and over the course of a month, on top of my other tasks, I developed and expanded this project and included others in its development. It is a project for our entire team, since it will help us discover the key areas we need to focus on and keep us updated on a daily basis.

At the beginning a lot of questions were asked. Now we have the visuals and the data to answer them. So let’s see if we can answer them…

  • Repack KPI, are we meeting our targets with the amount we repack?

We have set our target and can now easily see, on a weekly basis looking back over the past three months, whether we are achieving it. This data isn't very useful work-wise, since we are already processing all the repack we possibly can. However, it makes it easy to present our performance to people from outside.

  • Registered damaged articles into Recovery. Is the warehouse keeping under the target?

We have set the target and can now easily see whether the warehouse is achieving the goal. This can help people outside of Recovery understand whether we are damaging fewer articles. Recovery could help with that by investigating which articles are getting damaged and why.

  • How often are mistakes made when people are registering damaged articles? How many articles come into Recovery that are not registered? How often are articles registered but not coming to Recovery?

We have data showing whether anyone is making mistakes when registering articles on both ends: are we registering incoming articles correctly, and are the others registering correctly? We can use this to figure out how to reduce the number, which we all know should always be 0.
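
One way this cross-check could be expressed is a calculated column that looks up the article in the Astro registrations; all table and column names below are assumptions, not the real ones:

```dax
// On the received-articles table: flag rows with no matching registration
Registration Check =
VAR CurrentArt = 'Damages 068'[Art No]
VAR RegisteredQty =
    CALCULATE (
        SUM ( 'Astro Damages'[Qty] ),          // 'Astro Damages' is an assumed table name
        'Astro Damages'[Art No] = CurrentArt
    )
RETURN
    IF ( ISBLANK ( RegisteredQty ), "Illegal (not registered)", "OK" )
```

Ghost articles (registered but never received) would be found the same way from the opposite direction, on the registrations table.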

  • How many Quality Salestops have we processed in the past month?

It was easy to create this visual, and I also created a clustered column chart showing, month by month, the cases we have done.

  • How many Quality Investigations have we processed in the past month?

This has been the same as the Salestops.

  • How many Quality Actions have we processed in the past month?

This has been the same as the salestops and investigations.

  • How much AS-IS do we send per week?

It was a challenge to create week numbers, as Power BI doesn't have a native way to track data by week number, so I had to create DAX formulas. This will be useful for knowing how much is being sent to the stores whenever the stores complain.
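
A week-number column of this kind can be a one-line DAX formula; the table and date column names are assumptions. Return type 21 gives ISO-8601 weeks starting on Monday, which matches the fix noted later in the changelog:

```dax
// ISO-8601 week number (weeks start on Monday); names are illustrative
Week Number = WEEKNUM ( 'Damages 068'[Date], 21 )
```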

  • How much Scrap do we throw out per week?

This can be good to know so we understand the loss of profit that is coming from scrap. We all do not like scrap and want to find creative ways to reduce it. As we implement different schemes that can achieve this goal, we can keep an eye on our scrap level.

  • How many mistakes are made when registering an article as repack when it is not?

This is important since these mistakes can cost time. If it is sent to repack and it is found that it is either scrap or AS-IS, then the person repacking needs to spend time dragging a heavy article to the pallet of scrap. It’s not only time consuming, but back breaking as well, since the setup is tight around the repack table.

  • How many orphans have been found in the past week?

This is linked to the Spartacus Data (spare parts). Unfortunately, we do not collect this data at the moment but will in the new store. However, we do have this data for the largest store in Denmark. So we are able to create a sample of this data being visualised. 

  • How many parts from Spartacus went to Repack in the last week?

This is also data from the largest store in Denmark. I got this data directly from Spartacus. Unfortunately, Spartacus has a way of splitting the data into many spreadsheets, so I had to create relationships and create new columns containing combined data. The data shown isn’t just for the past week, but for all the previous weeks as well.

  • How many parts were placed into the Spartacus library in the past week?

This is also Spartacus Data that can easily be shown in a visual. With the relationships already formed, it was easy to create the visual. This was treated the same as the previous question.

  • How many Spartacus parts were sent to the stores in the past week?

This was treated the same as all the other questions pertaining to Spartacus.

  • How many Spartacus parts were delivered to the customers in the past week?

This was treated the same as all the other questions pertaining to Spartacus.

  • How many Spartacus parts got damaged in the past week?

This was treated the same as all the other questions pertaining to Spartacus.

  • Are we running low on Spartacus stock?

This was a little difficult, since we do not have the data to create targets yet. We need to collect that data once we have started to understand how we use those parts, or how we wish to use them. However, I did create a sample using a line and stacked column chart.
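
Until real per-part targets exist, a low-stock flag could be sketched as a measure with a placeholder threshold; every name and the threshold value below are assumptions:

```dax
// Placeholder low-stock flag; replace the fixed threshold with real targets later
Low Stock =
VAR CurrentStock = SUM ( 'Component_Inventory'[Qty] )
VAR Threshold = 10    // assumed placeholder value
RETURN
    IF ( CurrentStock < Threshold, "Low", "OK" )
```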

To conclude, this was a very fun project, which I worked on in the spare time I had between my day-to-day tasks. I took on the challenge and succeeded: I now have quite extensive knowledge of Power BI and a dashboard ready for the new warehouse we will be moving to, and I will continue to use Power BI and expand my knowledge of it. When I published the dashboard to the online version of Power BI, I found it didn't work so well because the data wasn't being updated; I had to go back and change the data sources from linking directly to an Excel sheet on my computer to connecting to a SharePoint folder instead. After a bit of time importing the "new" data and rewriting old DAX formulas, I finally had a dashboard I can be proud of. Next, I will work with my co-workers to figure out the best way to forecast the work we would have throughout the day.



Recovery KPI PowerBI Dashboard

Areas of these documents below have been purposely blanked out due to data sensitivity




Power BI Changelog

23/06/2022

  • Added Datasets Q-tracker and Damages 068
  • Changed values in document
  • Found there was an issue with dates not being processed correctly. Realised every date is in a different format and would have to be cleaned in Excel. Left that process until later
  • Removed Top rows of the Q Tracker Dataset since there was a lot of header data
  • Had to remove columns from Q tracker since they didn’t have any data

27/06/2022

  • Added datasets hejsjtusiklsis, Damaged Locations, hufdugishfusuh and Spartacus Stock Movements
  • Transferred the datasets from Astro and Spartacus into a text file and a CSV file to ensure maximum loading speed.
  • The hjfdshkfhjdskahkfhe dataset imported incorrectly, so I reloaded it with the delimiter set to tab. This solved the issue and it was imported correctly.
  • Corrected all the data types
  • Had issues with the dates being detected incorrectly and causing errors. Used the Trim function to eliminate spaces before and after the dates, then split the column to separate the time, then the numbers. After a long while I realised that the date format in the program may be set to Swedish. By reordering the number columns to put the year first, then the month, then the day, the dates were processed correctly and no longer raised an error when I changed the data type.
  • The Spartacus dataset had a second table within the table at the bottom of the data. I edited this by using the Remove Rows function to take out the bottom six rows. This got rid of the extra table and left us with only the raw data
  • Tried to remove blank rows in Q tracker, took away large amount of data so undid the action
  • Redid all the actions for Q Tracker and recreated it. Before importing the dataset again, I created a new column in the Reportlist to show the date code for the start date column, which I know works from previous experience. Then I hid the column, closed Excel, and imported the data again. It worked; the dates are now registered as date values. Then I deleted the old Q Tracker dataset.
  • Errors came out of Q Tracker when I tried to import. I recognised that every cell in the date column is filled, so I was able to use that column to remove rows with blank cells.
  • There was one row at the bottom of the table claiming an error because its cells display #VALUE. I checked the Reportlist and couldn't find any issues, so I will just take out the bottom row. This may cause issues in the future and will need investigating. Okay, that didn't work; I need to investigate further another day. I have removed that step from Power Query
  • Okay, for some reason PowerBI imported 16000 rows from Qtracker meaning there are a lot of blank rows above the actual data.
  • Went back into Power Query and changed the data type of the dates in the Damages 068 dataset to date/time. It was an easy and successful change.
  • Continued to change column names and data types in Damages 068 dataset
  • Removed columns in the Damages 068 dataset I considered containing useless data
  • The Q Tracker and Damages 068 datasets continue to show errors and refuse to load; perhaps in the future I can change data types as and when I need them

28/06/2022

  • Edited the Damages 068 document to ensure it includes the version number of the data row; I accidentally deleted that column earlier
  • Keep getting a strange error message when trying to load the data into Power BI from Power Query. Researching the issue and trying to solve it.
  • Deleted the Reportlist data and the damages 068 data. Went back, imported Damages 068 and ensured I kept the code clean and tidy. I didn’t change many data types and only changed two columns to date time and one column to a whole number
  • Reimported Qtracker
  • First I deleted all the automatic steps and then manually entered the name of each column, since the headers were on row 3
  • Then used the Remove Rows tool to take out the top 3 rows
  • Then deleted Column 31 to Column 61, since they contained only nulls
  • Then used the remove rows option again, this time to select the Article Number column and remove any rows that contain null in this column
  • Now we have the row with #VALUE in its cells at the bottom. Tried the Remove Rows tool again, this time to remove errors, but it still didn't work
  • Went into Excel and created another sheet in the Q Tracker Reportlist. I created a very simple formula to reference all the values in Q Tracker, starting from the third row down. The data came in very clean, except I have a lot of 0s where there was no data, and I want those to register as nulls. So I will go back and put an IF clause into the sheet to keep cells blank when they read no values.
  • Success. I was forced to create another sheet due to another Excel error where it didn't execute my formula. However, once I put the IF clause in to return no values, imported the data, and removed the blanks, it worked.
  • I locked the new sheet in excel containing the raw data to prevent it from being touched
  • I forgot to add the date code to the new sheet. However, as soon as I did, I got the same error row I had in the original sheet. The error seems to be attached to the date codes
  • I figured out the earlier problem where we couldn't display the dates correctly by changing the locale. I changed it to the UK and it worked on the original column, so I can delete the date-codes column and use the original sheet
  • Okay, I reimported the table again and went through the same steps: deleting the automatic steps, renaming the columns manually, deleting a lot of unnecessary and empty columns, removing the top three rows, and taking out any rows with nulls using the article number column as reference.
  • I changed the date type using the locale, although the data in the report list is quite messy, so I could not change the other data types to their specific types. I will only change them if I need to for the charts in the future.
  • Oh my word, I didn't get a single error when I closed and applied the data in Power Query to import into Power BI. The only data I'm missing now is the customer feedback; however, I am now ready to make the dashboard.

04/07/2022 – 06/07/2022

  • Spent time on editing the charts to show the data that is necessary with the smallest amount of charts and making them clear for the user to read
  • Created a table for targets to be able to apply to the graphs as a line and show whether we have met targets or not
  • Attempted to create a DAX formula to calculate whether or not a damaged article we have received was registered or not before it arrived to us
  • Created a new column in many tables with a simple DAX formula showing the week number, to be able to apply it to the graphs

07/07/2022

  • Attempted to use CALCULATE, SUM, FILTER and the like to make the calculation between two tables to discover whether something had been registered before. Decided to create a JOIN instead.
  • Created a relationship between Damages 068 and dshfjkdhksjfhkj
  • Had to rename a column I made earlier to calculate week numbers on the hfjkdshfjkdshkhj table. Checked, and the rename did not affect the graphs.
  • Had to delete all the columns I created due to errors in Damages 068 to be able to get the JOIN to work
  • Used the CROSSJOIN function to create a new table joining 068 Damages table and fhsjdhfjdhsjkfsdfhj table 
  • Interesting that this JOIN did not join through the article number declared in the relationship between the tables; it just placed the data from one table right next to the other. It didn't work because CROSSJOIN doesn't use relationships to define how the tables are combined
  • I thought about using UNION however the tables have different columns and UNION is used to extend one table with the other using the same columns
  • CROSSJOIN produced over 7 million rows, whereas the original data in both tables had fewer than a couple of thousand
  • I have decided to delete the JOIN and attempt to create a function that will SUMIF across multiple tables. I will do this another day
  • Just applied the following formula to the new column in Damages 068 and for some reason it works. I need to do more investigating to figure out why it works:
    Calculate = CALCULATE(SUM('hfsjkdhfkjdsh'[Qty]), 'Damages 068'[Art No]). I didn't reference the column in the hfjkdshkfjdhsjkjfkd table
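
As a footnote to the CROSSJOIN experiment above: in DAX, combining tables through an existing relationship is usually done with RELATED (on the many side) or RELATEDTABLE (on the one side) rather than a join function. A sketch with assumed table and column names:

```dax
// Calculated column: sum the related rows across the article-number relationship
Astro Qty =
SUMX ( RELATEDTABLE ( 'Astro Damages' ), 'Astro Damages'[Qty] )
```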

10/07/2022

  • Created relationships between all the tables in PowerBI. Linking them all together with the article number having “Damages 068” to be the central point. I also extended that to parts in Spartacus to be able to know the article number of the parts we need in Spartacus.
  • DAX is a full database programming language, meaning its interaction with data isn't anything like Excel's. Because Microsoft Learn says it is similar to Excel, and someone else said "if you know Excel, then you understand Power BI", I was mistaken in my understanding of how the language works behind the scenes. There are commands similar to R, like GROUP BY and others. I must explore an alternative option to SUM a column depending on the Article Number.
  • Found a way online: I included the ALLEXCEPT function in the formula and it worked! There must be some other filters at play in this table that I am not aware of. I kept seeing the comment about filters pop up in many places, but I assumed I didn't have any applied to the table since I hadn't applied any myself. But it worked. So now I have a column showing the full quantity of damages in Astro and the full quantity of damages from the SharePoint list. Now to apply date filters.
  • Interesting: every time I make a new formula referencing the other table, the DAX formula that doesn't ignore filters comes back with an error. Simply applying filters on another column can mess with the filters from that one. Got to make sure I override filters in the future.
  • Applied ALLEXCEPT formula to the column calculating the total of the hfjsdhfjkdskjhkhfsd. It seems like it is now not being affected by what I do for other columns 😀
  • Added a new column with a DAX formula to calculate ehtes and getsg for the last 16 days
  • Added a new column with a DAX formula to calculate Registered damages in the past 14 days. The reason I did this was to allow 2 days over for article to travel from being registered to recovery. I want to see how many articles are ghost (registered and not brought) or illegal (brought but not registered).
  • Created a column to show whether we received articles that hadn't been registered, or didn't receive articles that had been registered
  • Duplicated all the new columns into the hfjkdshfjkdshjkf column
  • Finally created Cards to display the ghost articles (registered as hhsju and not coming to recovery), Illegal articles (articles not registered and coming into recovery) and Articles that are okay
  • Using the CALCULATE function is heavily dependent on the table relationships
  • I imported new data about transaction types
  • I added a function that takes away hsfjkd from jfkldsjlkjfkldsjl in the table. I realised that I need to add a date and just bring out code hfsjd
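
The ALLEXCEPT pattern described above typically looks like the following: clear every filter on the table except the article number, so the total survives whatever slicers or visuals are applied. Table and column names are assumptions:

```dax
// Total damaged quantity per article, ignoring all filters except Art No
Full Damaged Qty =
CALCULATE (
    SUM ( 'Damages 068'[Qty] ),
    ALLEXCEPT ( 'Damages 068', 'Damages 068'[Art No] )
)
```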

11/07/2022

  • Under the further analysis tab, there is a way to put a constant line on the chart. So I can delete the targets table and instead program the targets directly in the visualisation settings.

12/07/2022

  • Was able to create an Article Name column in the Component Inventory sheet. I had to use a combination of functions, since there is no SPLIT function in DAX. I found a solution on the internet: use the LEFT function together with SUBSTITUTE (to replace a space with a marker, preparing the data for SEARCH) and the SEARCH function (which returns the character position where it detects the symbol SUBSTITUTE created)
  • Created a stacked column chart for the top 10 articles that have been moved in Spartacus for different purposes. Also applying a filter.
  • Created a new column in Damages068 to map the anonymised user IDs that SharePoint Lists gave me (due to privacy concerns) to the Ikea user IDs each of us uses. I figured out whose number was whose and nested IF statements on top of each other to get the Ikea user IDs.
  • Created a Slicer that displays the different users. It doesn't seem to work completely, since the table relationships are based on article numbers rather than UserID
  • I created another column using the EXACT and MID functions to show if the component column has certain text in. If it does, it means it is an orphan. This can give me the Orphan data.
  • Pulled new data from Spartacus to find out about stock replenishment of parts. Created a new column using the same formula as before to create a column with article names. Created a cluster column chart and applied filters to show how many parts are mostly coming into Spartacus and what the article name is.
  • Added a column chart that includes the amount of stock we currently have and put in a temporary line that declares how much we should have. This line will be changed for an actual list of values in the future. I would have to combine data for that and perhaps register that data in a separate database. Perhaps I could make that database with a formula that will automatically detect how much we use of certain parts and would know how much we should have in stock to determine future stock levels. 
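
The LEFT/SUBSTITUTE/SEARCH combination described above can be sketched like this, assuming the article name is the text before the first space in the description (table and column names are illustrative):

```dax
// Extract the text before the first space; falls back to the whole string
Article Name =
VAR Desc   = 'Component_Inventory'[Description]
VAR Marked = SUBSTITUTE ( Desc, " ", "|", 1 )            // mark only the first space
VAR Pos    = SEARCH ( "|", Marked, 1, LEN ( Desc ) + 1 ) // marker position, or fallback
RETURN
    LEFT ( Desc, Pos - 1 )
```

The fourth argument of SEARCH supplies a not-found value, so descriptions without a space return the full text instead of an error.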

21/07/2022

  • Looked at new visualisations to display KPI’s to show more data, experimented
  • Realised I needed a new way of visualising week numbers and having them in order, so I created a calendar table for the task, starting from today, connected to the Damages 068 table, since that is directly connected to Power BI and everything else relates to it
  • Creating a relationship between the Damages 068 table and the new calendar table
  • For some reason, the week-number formula broke when I created the relationship after adding .[Date] to declare it is a date. Removing the .[Date] solved the problem
  • It did not work
  • Added a year column to the calendar table
  • Later I will add a sum of the different quantities for each entry to the calendar, to be able to show the data from one table on all the charts.
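
A calendar table of the kind described can be generated entirely in DAX; the date range here is an assumption:

```dax
// Generated date table with year and ISO week number columns
Calendar =
ADDCOLUMNS (
    CALENDAR ( DATE ( 2022, 1, 1 ), TODAY () ),
    "Year", YEAR ( [Date] ),
    "Week Number", WEEKNUM ( [Date], 21 )    // type 21 = ISO weeks, Monday start
)
```

Marking it as the date table then lets Power BI's time-intelligence functions use it.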

22/07/2022

  • Created a column in calendar table for the Repack quantity
  • Created a column in calendar table for the repack quantity a week back
  • Changed title of Repack quantity
  • I found there wasn’t any point in going on that journey since the data is the same that I retrieve afterwards. I simply solved the issue by applying a filter for the last 3 months
  • Changed the Repack chart to a stacked column chart, then under the analysis tab inserted a constant line, replacing the weekly target table data
  • Turned on the "Data Label" as well, adding the label right above the line to indicate that it is the repack target
  • Changed the maximum y-axis range to 125 on the Repack Quantity chart
  • Deleted the previous-week column in the hdshjjfdshjfdsu table; can use the data from Calendar to show the previous week
  • Applied filter for the last 3 months instead of year on decisions per week
  • Changed the title to reflect that it is for the last 3 months
  • Found we have week 30 inside week 29 on the Decision chart. Changed the week number in the Damages 068 chart so that the week starts on Monday; the week numbers may not have been Scandinavian week numbers. Changing the DAX WEEKNUM function to type 21 solved the issue. I should do this for all week numbers in all tables.
  • Changed Week number return type in hsfueisiwofofs, hfsjfhdjskhfds & Calendar.
  • Marked the Calendar table as the date table
  • Edited the dashboard heavily to look better. Deleted all the title cards, using only the titles of each table
  • Attempted to create a column for yesterday's scrap-decision quantity but am struggling to create a simple DAX formula. Will look at it tomorrow with fresh eyes

23/07/2022

  • Edited titles of Quality Cards
  • Changed around location of cards
  • Turned the background off on Spartacus Flow and Today's Decisions to reveal the card that overlaps the border
  • Formatted each visual to ensure they do not interact with data that isn't relevant or connected to the task
  • Created slicers that show the individual users. Had to create 3, since the relationships between the tables are based on article numbers and not user IDs, so one slicer could not slice across the different tables
  • I have looked at creating columns that show the previous month's figure for the different cards, so I can make a comparison with a KPI visual and show whether we have improved. I attempted a CALCULATE function in DAX using different methods; I copied over a formula connected to a different table and it worked perfectly there, but here I kept getting blanks returned. After a long time trying to figure out what was wrong, I realised it doesn't matter much, since only one card would benefit from this treatment. So I have decided to leave it.
  • Deleted the target tables, since I am using the lines instead to indicate the targets, and the reg/not-reg data, since I made a column in Damages 068 for that
  • Deleted all the non-working columns in Calendar
  • Created a hsfhsjdkfdskjlfdsh column in Calendar
  • Added a Trans Type table from data pulled from Astro showing transaction movements of articles in the warehouse, filtered in Astro to show only us in Recovery
  • Attempted to add a SUM of hhgsy depending on the date. Got as far as filtering by transaction type, but not by date
  • Went back into Transform Data and applied the same treatment to the dates column as I had with the others, splitting and merging into the correct format. Afterwards the date worked in the new column, but I can’t seem to apply the filter at the same time: the filter works on its own and the dates work on their own, but not together. I also had to create a new relationship between the Trans Types and Calendar tables
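For the previous-month comparison that kept returning blanks, a typical pattern looks like the sketch below (the 'Repack' table and measure names are hypothetical). Blanks from a formula that works against another table are often a sign of a missing or inactive relationship between the Calendar and the fact table, which may be worth checking here:

```dax
Repacked Qty = SUM ( 'Repack'[Quantity] )

-- Same measure shifted back one month; requires the Calendar to be
-- marked as the date table and related to 'Repack'
Repacked Qty PM =
CALCULATE ( [Repacked Qty], PREVIOUSMONTH ( 'Calendar'[Date] ) )
```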

25/07/2022

  • Changed the cards that display decision mistakes from filtering for only versions 2, 3 and 4 to an advanced filter saying “not 1”, in case we get an entry with version 5 or more
  • I finally managed to figure out the hfsdio problem using SUMX
  • I created a column with repack – hhg
  • Then created a column for hhget – 1 week to be able to compare in a chart
  • Then created a repack – ghted – 1 week column
  • Took out the year in the week number column
  • Created an overall dataset of all transactions and needs from Astro and am slowly adjusting formulas to pull data from that dataset
  • Deleted the columns Calculate full ghtvehsielts Qty and Calculate Full Damaged Qty in the Damaged 068 database
  • I have been wanting everything to come from one dataset, so I created a new “Trans Types” table that includes more users and trans types. That way I can write formulas to pull the specific data I need rather than maintaining separate datasets, which makes updating easier.
  • Already created the new repack quantity data in Calendar
  • Attempted to create a new column for calculating 16 days hgsetitsshtes in Damages 068. Need to come back to figure out why it is not working.
  • Had to remake the new column for the calculate 16 days tues0f t s uts from scratch in the Damages 068 table. Also found the original wasn’t displaying proper data. Problem solved, and the column is now connected to the new table from Astro rather than the old one
  • Created a new column for Not reg/reg with the ABS function applied, to ensure no negative numbers are shown on the dashboard. Then updated the card showing what has not been registered and brought to Recovery in the last week
  • Deleted the hgtsieigtshetis tables from the model and am now checking the effect this has on the other tables and the dashboard
  • Calendar has a lot of columns that have come up with errors and aren’t used for any visuals. I had already rebuilt the necessary ones as new columns with formulas connected to the new table, so I have deleted the erroring columns
  • Found the Repack quantity x –hgt column came up with an error. I corrected the formula to fix it.
  • Edited the new hjy column in the Calendar dataset to include user IDs
  • Deleted the original Trans Types table, which included just us in Recovery, and deleted “Damaged Locations”, since I do not need that dataset for the dashboard
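Two of the patterns above, sketched in DAX with hypothetical column names (the real ones are redacted): a SUMX over a filtered table, and an ABS column that keeps the reg/not-reg difference non-negative:

```dax
-- Measure: sum Quantity row by row over the last 7 days of transactions
-- ('Trans Types'[Date] and [Quantity] are assumed column names)
Qty Last 7 Days =
SUMX (
    FILTER ( 'Trans Types', 'Trans Types'[Date] >= TODAY () - 7 ),
    'Trans Types'[Quantity]
)

-- Calculated column: absolute difference so the dashboard never shows negatives
Not Reg Abs =
ABS ( 'Damages 068'[Registered Qty] - 'Damages 068'[Arrived Qty] )
```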

26/07/2022

  • Added a “Sample” sign over the Spartacus stacked chart, since we don’t have Spartacus in this store

27/07/2022

  • Redid the articles-not-registered column in the Damages 068 dataset, changing the days to more accurately represent a week
  • Took LSC RT out of the calculation in the “Calculate 7 days registered” column in Damages 068 (via DAX) to figure out whether there were any registration problems
  • Changed the colours of the not-registered, registered-but-not-arrived, and correctly registered articles to better represent what they are
  • Created a new table with unique article numbers and added columns that determine whether or not an article has been registered properly. I did this because the original chart was summing a sum of the same article and producing unusually big numbers
  • Edited visuals to represent the new numbers
  • Had to move the data PowerBI runs off from folders on my computer to SharePoint (using SharePoint folders) so everyone else has access to it. I did this because I was unable to link the data to PowerBI online, where we found we were being put into a database with others. We found it easier to have everyone use the desktop app to access the dashboard.
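The unique-article table above can be sketched like this in DAX (column names hypothetical): DISTINCT gives one row per article number, which avoids summing the same article several times, and a CALCULATE inside ADDCOLUMNS flags whether that article was ever registered:

```dax
-- One row per article, with a registration flag
Articles =
ADDCOLUMNS (
    DISTINCT ( 'Damages 068'[Article Number] ),
    "Registered OK",
        CALCULATE (
            COUNTROWS ( 'Damages 068' ),
            'Damages 068'[Registered] = TRUE ()
        ) > 0
)
```

Context transition inside CALCULATE turns each article-number row into a filter, so the flag is evaluated per article rather than over the whole table.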

01/08/2022

  • I found out about an Astro add-on called OpView, which shows live Astro data as dashboards, and OpDoc, which sends reports directly to your email after you set up a SQL query. I realised I could possibly use this to pull data through email and then use Power Automate to extract the data from the email.
  • After some time I realised that I could possibly use Power Automate for the whole process. I will investigate this further now.
  • I added a slider that lets the user decide which weeks to look at when viewing the percentage of each decision made.

02/08/2022

  • Attempted to create a Power Automate routine to pull the data we need for PowerBI automatically. I tried several scripts but discovered that it is not possible to log in using Power Automate, the dates need to be entered manually in Spartacus, and there seems to be a problem with filtering in Astro. I can go back and create a SQL query in Astro to pull the data, but I need to ask the right people to help make this happen. I also need to use the Astro add-on OpDoc to send me email updates with the data needed for PowerBI, which Power Automate can then download and save as an attachment.