Application Development Blog Posts
Learn and share on deeper, cross-technology development topics such as integration and connectivity, automation, cloud extensibility, developing at scale, and security.
BaerbelWinkler

I recently got pulled into a helpdesk ticket raised by a user department because they had noticed something "off" with the delivery note attachments of several deliveries. For some unknown reason, certain deliveries kept getting the same PDF attached over and over again - and not just a few times, but up to almost 140,000 times!

[Image: JourneyOfDiscovery-01.jpg]

Collecting information

To begin with, we didn't really know where these attachments were coming from, and our only lead was the many entries also being added to the ArchiveLink table TOA01. There, we noticed a big jump in daily additions around November 20: before then, daily additions were in the three-digit range, and afterwards they quickly jumped to about 250K entries per day.

[Image: JourneyOfDiscovery-03.jpg]
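To illustrate the kind of check involved, here is a minimal Python sketch that flags the first day whose additions dwarf the preceding baseline. The counts are made up for illustration (three-digit baseline, then the jump); only the detection logic reflects what we actually looked for in TOA01:

```python
from statistics import mean

def find_spike(daily_counts, factor=10):
    """Return the index of the first day whose count exceeds
    `factor` times the average of all preceding days."""
    for i in range(1, len(daily_counts)):
        baseline = mean(daily_counts[:i])
        if daily_counts[i] > factor * baseline:
            return i
    return None

# Hypothetical daily TOA01 additions: three-digit baseline, then the jump
counts = [310, 290, 450, 380, 250_000, 248_000]
print(find_spike(counts))  # -> 4
```

In practice this was done by eyeballing the date-grouped counts, but the same threshold idea would work as an automated alert on table growth.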

I then did a where-used analysis for table TOA01 and found function module ARCHIVE_CONNECTION_INSERT, which looked like a promising lead based on its name alone! Luckily, we have the SAP Call Monitor (SCMON) active in our productive environments, so I went there to see if there were any "interesting" hits for the function module. Not too surprisingly, there was a very clear outlier within the 7 days' worth of data readily available:

[Image: JourneyOfDiscovery-04.jpg]

So I checked our RFC-enabled function module, which looked fine logic-wise and also hadn't been changed recently. It therefore couldn't really be the underlying reason for what was happening in the system - but I now had an impacted piece of code to take a closer look at.

Dynamic logpoints to the rescue!

As far as I'm concerned, dynamic logpoints (transaction SDLP) are one of the best options for troubleshooting issues like the one at hand, where you can't reproduce the process standalone by starting a report or transaction. So I identified a suitable place in the code and defined a logpoint for a few relevant fields:

[Image: JourneyOfDiscovery-05.jpg]

It didn't take long for a very distinct pattern to emerge: most delivery numbers showed up with slowly increasing key counts (the combination of fields in the Key Definition), while a few others went up far more quickly.

Slow cycle with something between 36 and 40 minutes between each addition:

[Image: JourneyOfDiscovery-06.jpg]

Fast cycle with something between 1m50s and 2m30s (ignoring the outlier):

[Image: JourneyOfDiscovery-07.jpg]

Watching this for a few days, the pattern was eerily consistent: it was roughly possible to tell when the numbers would go up again for one of the deliveries simply by knowing whether it was stuck in the slow or the fast cycle! Just how regular the pattern was can be seen in this screenshot with only partially masked delivery numbers. I had the dynamic logpoints active for about 8 hours each day, during which they were triggered about 74K times each:

[Image: JourneyOfDiscovery-08.jpg]
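The slow/fast distinction above boils down to the median interval between logpoint hits per delivery. A small Python sketch of that classification, with invented timestamps (in seconds) that roughly match the observed ~2-minute and ~38-minute cycles:

```python
from statistics import median

def classify_cycle(timestamps, threshold_seconds=600):
    """Classify a delivery's re-attachment cycle by the median
    interval between consecutive logpoint hits."""
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return "fast" if median(intervals) < threshold_seconds else "slow"

# Hypothetical logpoint hits: fast cycle, roughly every 2 minutes
fast_hits = [0, 115, 240, 355, 490]
# Slow cycle, roughly every 38 minutes
slow_hits = [0, 2280, 4500, 6900]
print(classify_cycle(fast_hits))  # prints "fast"
print(classify_cycle(slow_hits))  # prints "slow"
```

Using the median rather than the mean keeps a single outlier interval (like the one visible in the fast-cycle screenshot) from skewing the classification.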

Resolution of the issue

By that time we had collected more than enough data to be fairly certain that the root cause was outside our SAP system and had to lie with one of our external logistics partners, who kept re-sending the same delivery notes over and over again instead of just once. After we got in touch with them, the issue was quickly resolved: the root cause of all of this was a local C: drive running out of space without anybody noticing! Once that was tackled on their end, the number of transmitted delivery notes immediately went back down to pre-issue levels!

Next steps

Now that we have successfully stopped the build-up of unwanted attachments, we'll need to find a way to get rid of the millions of obsolete attachments in an orderly fashion - obviously always leaving one attachment intact per impacted delivery while deleting all the others. I have already identified a function module that will most likely help with this task: ARCHIV_DELETE_META. I'll however have to investigate how best to get rid of the many items without causing issues for the system with this many updates.
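The keep-one-delete-the-rest bookkeeping, independent of how the actual deletion is triggered (e.g. via ARCHIV_DELETE_META), could look like this Python sketch. The batch size and IDs are hypothetical; the point is splitting the duplicates into chunks so the deletions can be committed in manageable portions instead of one massive update:

```python
def plan_deletions(attachment_ids, batch_size=500):
    """Keep the first attachment of a delivery, and split the
    remaining (duplicate) IDs into batches for chunked deletion."""
    keep, rest = attachment_ids[0], attachment_ids[1:]
    batches = [rest[i:i + batch_size] for i in range(0, len(rest), batch_size)]
    return keep, batches

# Hypothetical delivery with 1 original + 1400 duplicate attachments
ids = list(range(1, 1402))
keep, batches = plan_deletions(ids)
print(keep, len(batches), len(batches[-1]))  # prints "1 3 400"
```

Processing per-delivery batches with a commit after each chunk should keep update load and rollback segments small even for the worst offenders with ~140K duplicates.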

So here is a question for you: did you by chance already have to tackle a comparable clean-up of ArchiveLink attachments, and if so, am I on the right track with ARCHIV_DELETE_META, or are there other/better options available? If you have any suggestions, please mention them in the comments. Thanks!
