Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
JonGooding
Product and Topic Expert
This is the how-to companion to the more business-focused post on LinkedIn, covering how I calculated my carbon emissions from travelling using Garmin and Strava data.

The process is essentially as follows.


The data I am using for this first version of the analysis comes from my Garmin watch (previously my iPhone), but essentially all the data is centralised in Strava.

It should be noted that the Strava export doesn't easily provide the first GPS location of each event, so that has to be extracted from the individual event file associated with each activity.
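Pulling the first GPS fix out of a GPX event file takes only a few lines of Python. A minimal sketch using the standard library (the function name is illustrative; the namespace shown is the standard GPX 1.1 one, though a particular export may differ):

```python
import xml.etree.ElementTree as ET

# GPX documents use a default XML namespace that must be named explicitly
GPX_NS = {"gpx": "http://www.topografix.com/GPX/1/1"}

def first_trackpoint(gpx_xml: str):
    """Return (lat, lon) of the first <trkpt> in a GPX document, or None."""
    root = ET.fromstring(gpx_xml)
    trkpt = root.find(".//gpx:trkpt", GPX_NS)
    if trkpt is None:
        return None
    return float(trkpt.get("lat")), float(trkpt.get("lon"))
```

Run over each activity's GPX file, this gives the starting location that the bulk activities export is missing.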

Data Preparation (Ideally refactored in SAP Data Intelligence)

  1. Using the Strava bulk data export feature, I am able to get a file export_<userid>.zip of all my activities.

  2. Copying this file to a Linux server, performing some data preparation, and extracting the data into four areas:

    1. Activities.csv - Strava's summary data for each event

    2. GPX files - details of the event in GPX format

    3. TCX files - details of the event in TCX format

    4. FIT files - Garmin's binary format



  3. Using some open-source code, modernised for an SAP HANA Cloud target, each of the file types was imported into HANA Cloud:

    1. Activities.csv - Python, using pandas and hdbcli

    2. GPX files - PHP, based on XML extracts

    3. TCX files - PHP, based on XML extracts

    4. FIT files - Python, based on fitdecode
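The Activities.csv load above boils down to parsing the CSV and batching rows for a bulk insert via hdbcli. A stdlib-only sketch of the shaping step (the column names are illustrative of the Strava export, and the table/column mapping is an assumption):

```python
import csv
import io

def rows_for_insert(csv_text: str):
    """Parse an activities CSV and yield (id, date, type, distance) tuples,
    shaped for a bulk insert with cursor.executemany()."""
    reader = csv.DictReader(io.StringIO(csv_text))
    for row in reader:
        yield (
            row["Activity ID"],
            row["Activity Date"],
            row["Activity Type"],
            float(row["Distance"]),
        )
```

With a live HANA Cloud connection, the resulting tuples would feed something like `cursor.executemany("INSERT INTO ACTIVITIES VALUES (?, ?, ?, ?)", list(rows))`.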




HANA Cloud




  1. Noting that each event does not carry a location, the activities table needed to be linked to the extracted events dataset.

  2. To match each of the source files, the following objects were created:

    1. Activities - a simple table created to match

    2. Source_Activity_Details - the extracted per-event details

    3. The initial location of the activity is linked via a view; this was initially displayed in DBeaver.

    4. To link the location of the previous event to the next event by time, the ROW_NUMBER window function was used, ordered on the activity date.

    5. Then a relatively simple geospatial predicate classifies each leg. Also make sure you are using the geospatial reference SRID 4326, so that a trip from Australia over the date line to the Cook Islands is not routed via Europe and the Americas:
      CASE
        WHEN a.geo.ST_Distance(b.geo) / 1000 <= 10
        THEN 0
        WHEN a.geo.ST_Distance(b.geo) / 1000 BETWEEN 10 AND 200
        THEN (a.geo.ST_Distance(b.geo) / 1000) * 0.00003
        ELSE (a.geo.ST_Distance(b.geo) / 1000) * 0.00015
      END AS emission,
      CASE
        WHEN a.geo.ST_Distance(b.geo) / 1000 <= 10
        THEN 'Run/Bike'
        WHEN a.geo.ST_Distance(b.geo) / 1000 BETWEEN 10 AND 200
        THEN 'Car'
        ELSE 'Flight'
      END AS ModeT

      The emission multipliers were based on figures from a range of sites. In the next version, aircraft emissions APIs could be used.

    6. The activity date, event details, location, and emission are exposed as a technical view (called FINAL); this is used as a source for SAP Data Warehouse Cloud, allowing different models to be easily created.
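The previous-location linking and the CASE logic above can be sketched in Python, using a haversine great-circle distance in place of HANA's ST_Distance on SRID 4326. The multipliers are the same illustrative ones as in the SQL, and the function names are my own:

```python
import math

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def classify(prev_loc, loc):
    """Mirror the SQL CASE: the distance from the previous event's location
    decides the transport mode and the per-km emission multiplier."""
    km = haversine_km(prev_loc, loc)
    if km <= 10:
        return "Run/Bike", 0.0
    if km <= 200:
        return "Car", km * 0.00003
    return "Flight", km * 0.00015

def link_previous(events):
    """Like the ROW_NUMBER ordering over activity date: sort by date, then
    pair each event with the previous event's location (the first event in
    the sequence has nothing to compare against)."""
    out, prev = [], None
    for date, loc in sorted(events):
        if prev is not None:
            mode, emission = classify(prev, loc)
            out.append((date, mode, emission))
        prev = loc
    return out
```

An event 700 km from its predecessor (e.g. Sydney to Melbourne) classifies as a flight; anything within 10 km is treated as run/bike with zero emission.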




SAP Data Warehouse Cloud

  1. SAP Data Warehouse Cloud is normally showcased with a mix of SAP and non-SAP data. This use case is entirely non-SAP data.

  2. Creating a connection to the HANA Cloud instance

  3. Access the remote table objects from the HANA Cloud connection, including the FINAL view and ACTIVITIES table. The remote tables have been created as a virtualised dataset.

  4. Build the model for the emissions scenario, also utilising standard time objects in the model.

  5. The view also has both the start location and end location linked to the activity. These locations are views that store the ST_Point (so it can easily be put on a map) using Spatial Reference ID 3857.
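The re-projection behind those location views, from SRID 4326 (lat/lon) to SRID 3857 (Web Mercator, the system most map layers expect), can be written out directly. HANA does this for you; the sketch below only shows the standard Web Mercator maths:

```python
import math

R = 6378137.0  # WGS 84 semi-major axis used by Web Mercator (EPSG:3857)

def to_web_mercator(lat, lon):
    """Project a WGS 84 (SRID 4326) lat/lon onto EPSG:3857, in metres."""
    x = math.radians(lon) * R
    y = math.log(math.tan(math.pi / 4 + math.radians(lat) / 2)) * R
    return x, y
```

The origin (0, 0) maps to (0, 0) metres, and longitude 180 maps to the projection's eastern edge at roughly 20,037,508 m.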


SAP Analytics Cloud

  1. Using the view from SAP Data Warehouse Cloud, the models can easily be displayed over the live connection. Some sample pages (without styling applied):

  2. Sample views include: all-years stats; the SAC flow layer feature; and 2022 specifically (filtered and highlighted).


Additionally:

Both HANA Cloud and SAP Data Warehouse Cloud were used to keep a variety of options open for using this data down the track. For simplicity, it could all have been done in SAP Data Warehouse Cloud.

Thanks to Kunnal Khanna for building out the HANA Cloud views for me.

Also please note: the above process has been scripted and will be made available to other users when ready. It's not quite ready to be shared publicly yet, as I want to test it with a few other users' data sets first.

And yes, there are planned updates:

  1. Treating activities of type "Commute" (by bike, walking, or running) as negative on my charts

  2. Using aircraft CO2 emission calculator APIs to calculate the CO2 more accurately


Happy to take more feedback, and hopefully soon I'll publish the process scripts so others can do this too.