One of the things I have noticed when setting up Oracle Management Cloud for Log Analytics is the amount of housekeeping that HASN'T been done on log files. I recently did a Log Analytics deployment on a WebLogic Server and it took me a while to find the log files that were actually in use. In fact, this application server had two WebLogic installations running (and about two more that had never been removed).
So, with all these log files hanging around, you can put them to good use and load them into Log Analytics. There are two reasons to do this. One is for fun (come on, it's Christmas!). The other is to load in log files relating to major incidents that occurred before you started running Log Analytics.
This functionality is highlighted here but, as you can see, it isn't well documented.
To perform an upload on demand you can use a simple curl command. Identity domains and database names have been changed to protect the innocent.
curl --insecure \
  -u 'firstname.lastname@example.org' \
  -X POST \
  -H 'X-USER-IDENTITY-DOMAIN-NAME:redstack' \
  --form 'data=@C:\cygwin64\alert_orcl.log' \
  "https://redstack.loganalytics.management.us2.oraclecloud.com/serviceapi/logan.uploads?uploadName=Upload1&targetName=orcl&targetType=omc_oracle_db_instance&createTarget=true&logSourceName=DBAlertLogSource&logParserName=db_dbalertlog_body_logtype"
When I came to do this, finding the relevant internal names for the targetType, logSourceName and logParserName URL parameters was difficult. However, if you go to an existing OMC agent inst home and look at the logrules_os_file.xml file in the application state directory, you can see all the internal names. The other thing to note here is the createTarget=true parameter, as I wanted this particular log upload to create a new entity.
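Once you've found logrules_os_file.xml, a quick grep pulls out the internal names in one go. The snippet below is a sketch that works against a MOCK copy of the file, because the real file's exact layout isn't documented and may differ; the element and attribute names here are assumptions, but the grep itself only relies on name="..." attributes being present. Point the grep at the real file under your own agent inst home instead.

```shell
#!/bin/sh
# Create a MOCK logrules_os_file.xml for illustration only --
# the structure below is an assumption, not the real file.
mkdir -p /tmp/logan_demo
cat > /tmp/logan_demo/logrules_os_file.xml <<'EOF'
<LogRules>
  <LogSource name="DBAlertLogSource">
    <Parser name="db_dbalertlog_body_logtype"/>
    <TargetType name="omc_oracle_db_instance"/>
  </LogSource>
</LogRules>
EOF

# Pull out every name="..." attribute -- these are the internal
# names you paste into the upload URL's query parameters.
grep -o 'name="[^"]*"' /tmp/logan_demo/logrules_os_file.xml
```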
If the upload succeeds, the call returns JSON output. Also note that you can upload individual log files or zips of log files.
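Since zips are accepted too, here is a minimal sketch of preparing one for upload. It uses Python's stdlib zipfile CLI so it works even where the zip command isn't installed, and it prints the curl call rather than running it, because executing it needs a live OMC tenancy and real credentials. The host, user, upload name and source/parser names are just the placeholders from the example above.

```shell
#!/bin/sh
# Write a small sample log and zip it up for upload.
mkdir -p /tmp/logan_demo
printf 'ORA-00600: internal error, arguments: [demo]\n' \
  > /tmp/logan_demo/alert_orcl.log

# Python's stdlib zipfile module doubles as a zip CLI.
python3 -m zipfile -c /tmp/logan_demo/alert_orcl.zip \
  /tmp/logan_demo/alert_orcl.log

# The same curl call handles the zip; only the --form file changes.
# Printed, not executed: it needs a real tenancy and credentials.
cat <<'EOF'
curl --insecure \
  -u 'firstname.lastname@example.org' \
  -X POST \
  -H 'X-USER-IDENTITY-DOMAIN-NAME:redstack' \
  --form 'data=@/tmp/logan_demo/alert_orcl.zip' \
  "https://redstack.loganalytics.management.us2.oraclecloud.com/serviceapi/logan.uploads?uploadName=Upload2&targetName=orcl&targetType=omc_oracle_db_instance&logSourceName=DBAlertLogSource&logParserName=db_dbalertlog_body_logtype"
EOF
```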