Hostkey Fingerprint in BODS 4.3


Looking at the manual and the SAP notes that are available, I've generated the hostkey fingerprint value. However, Designer is complaining that it doesn't like what I provided.

The command used is: ssh-keygen -E md5 -lf /etc/ssh/ssh_host_ecdsa_key.pub

The return value follows this format:

256 MD5:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx no comment (ECDSA)

I've tried different variations of this, with and without the "MD5:" and the "no comment (ECDSA)". Nothing is working. The local machine can connect using PuTTY, SFTP, FileZilla, etc. just fine. Command-line SFTP will connect without a password after setting up the public/private keys. But the BODS File Location says that it's an "invalid host key fingerprint value."

Designer version: 14.3.0.113

jmuiruri
Product and Topic Expert

Hi eganjp

There is a trick you can use just to get you going: ignore the error message and save the File Location object, then create a simple job that reads from or writes to the file location object.

Run the job and it should fail with an error message similar to:

The SSH session terminated because the host key fingerprint <xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx> does not match
the host key fingerprint <yy:yy:yy:yy:yy:yy:yy:yy:yy:yy:yy:yy:yy:yy:yy:yy> for file location object <>. For more
information, see SAP Note 2838796.

Copy host key fingerprint yy:yy:yy:yy:yy:yy:yy:yy:yy:yy:yy:yy:yy:yy:yy:yy and replace the host key fingerprint in the File Location Object with this fingerprint.
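The copy step can also be scripted. A minimal sketch, assuming the error text above has been saved to a log file (the file path and the exact error wording here are assumptions for illustration):

```shell
# Hypothetical saved copy of the job's error text.
cat > /tmp/trace.log <<'EOF'
The SSH session terminated because the host key fingerprint <xx:xx:xx:xx> does not match
the host key fingerprint <yy:yy:yy:yy> for file location object <>.
EOF

# The second <...> token is the fingerprint the server actually presented;
# strip the angle brackets to get the value to paste into the File Location object.
grep -o '<[^>]*>' /tmp/trace.log | sed -n '2p' | tr -d '<>'
```

This prints yy:yy:yy:yy, the server-side fingerprint from the second line of the message.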

Best Regards,

Joseph

jmuiruri
Product and Topic Expert

Hi eganjp,

In your example below, use just the xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx portion, i.e. drop the leading "256 MD5:" and the trailing "no comment (ECDSA)":

256 MD5:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx no comment (ECDSA)
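If you script the setup, the same trimming can be done with a small filter. A sketch, using a sample line in the format shown above (on the real host you would pipe the actual `ssh-keygen -E md5 -lf ...` output through the same filter):

```shell
# Hypothetical sample of the ssh-keygen output line.
sample='256 MD5:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx no comment (ECDSA)'

# Keep the second field and drop the "MD5:" prefix, leaving only the
# colon-separated hex pairs the File Location object expects.
echo "$sample" | awk '{print $2}' | sed 's/^MD5://'
```

This prints only the xx:xx:...:xx portion.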

Regards,

Joseph

Thanks Joseph! The File Location is connecting now.

This is sort of related to the issue of getting the right Hostkey Fingerprint. I can start a new thread if necessary.

My File Location objects need to be exported out of the Development environment and imported into the Test environment. This is done along with many other objects, but the issue is with the File Locations. The migration has to be automated, using the AL_Engine command line executable for both the export and import. The Hostkey Fingerprint and the login password are both encrypted in the ATL file.

I need to know how to use the -epassphrase parameter of AL_Engine. The passphrase isn't provided as plain text on the command line, right? If the passphrase has to be encrypted, how do I do that with the AL_Encrypt executable?

jmuiruri
Product and Topic Expert

Hi eganjp,

First you need to encode the password to base64; you can then use the encoded password during export, and also during import if you use the command line. However, when you import the objects via the Designer, you need to provide the decoded password.

## In Linux, to encode the password to base64 you can do, for instance:
$ echo "Password1" | base64
UGFzc3dvcmQxCg==
## Note: plain `echo` appends a trailing newline, which gets encoded too;
## `echo -n "Password1" | base64` encodes the exact string instead.
## al_engine -U<db_repo_user> -P<db_repo_pass> -S<db_server_name> -N<db_type> -Q<db_repo_name> -Xp@<objecttype>@<filename>@<objectname>@DE -epassphrase<base64 encoded pass>
## Example below

$ al_engine -Usapdsuser -P******** -SPXX-YYYYY.sap.com -NMicrosoft_SQL_Server -QPXX_YYYY7_DS -Xp@P@exported_projects.atl@Benchmark@DE -epassphraseUGFzc3dvcmQxCg==
### In the above example I want to export the project Benchmark and all its dependent objects

When I want to import in the Designer, I will provide Password1 as the password instead of the encoded password.
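As a quick sanity check on the encoding step, you can round-trip the passphrase locally. A sketch (note that a plain `echo` also encodes a trailing newline, which is why `printf '%s'` or `echo -n` gives a slightly different string than the example above):

```shell
# Encode the passphrase without a trailing newline.
enc=$(printf '%s' "Password1" | base64)
echo "$enc"                      # prints UGFzc3dvcmQx

# Decode it back to confirm the value Designer will expect at import time.
printf '%s' "$enc" | base64 -d   # prints Password1
```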

Hope this answers your question.

Best Regards,

Joseph

Perfect! I think I have it working now. I used al_encrypt to give me the base64 version of my passphrase. Your examples were very helpful.

The background I gave is a bit more simplistic than what I'm actually doing. I wrote code to generate Data Services code for a File Location, File Format, Dataflow and Workflow and tie all of them together. Each object type is written to an ATL file and then imported into the repository.

I have a large number of files (in different directories on the same secure host) to bring in to different tables in our staging area. There are data-type-specific mappings in the Dataflows that transform source data into what I need it to look like in the target table, so it's not just a one-to-one mapping. Each time the source system makes a change, we'll have to redo all of the objects. Doing this manually would be quite tedious, so having a way to automatically generate all the objects is going to save a ton of time and ensure that there are no mistakes. At this time, I have 113 tables to populate. That number will grow in the future.

jmuiruri
Product and Topic Expert

Hi eganjp,

Sounds exciting, have fun!