Best Practices
Oracle recommends the following best practices for setting up and using Oracle Log Analytics:
Topics:
- Best Practices for Configuring Oracle Log Analytics
- Best Practices for Logging
Best Practices for Configuring Oracle Log Analytics
- Plan Your Log Groups:
In Oracle Log Analytics, log groups serve as logical containers for organizing and storing collected logs. Each log group resides within a specific compartment, facilitating user access control. By assigning appropriate permissions at the compartment level, you can manage which users have access to the logs contained within the log groups of that compartment.
See Create Log Groups to Store Your Logs and Manage Log Groups.
Careful planning of log groups ensures:
- Efficient Access Control: Assign appropriate users and groups access to specific log data.
- Scalability: Easily adapt the structure as your organization grows.
- Compliance and Security: Support regulatory needs by properly isolating sensitive logs.
- Ease of Maintenance: Simplify querying, alerting, and troubleshooting.
Approaches to organizing log groups: Consider the following common strategies.
- By Log Type: Group logs by their type, such as access logs, audit logs, security logs, or application logs.
- By Entity or Environment: Organize logs by servers, applications, business units, or environments (for example, Production, Development, Test).
- By Customer (for Service Providers): Create separate log groups for each customer to ensure isolation and customized access control.
Example scenarios:
- Example 1: A service provider has two compartments: Operations, which stores basic operational logs, and Secured Content, which contains logs that require restricted access because they hold sensitive information. Each compartment can contain many log groups. For example, the Operations compartment has Server Logs and Access Logs log groups, and the Secured Content compartment has Audit and Transaction log groups. Using OCI IAM policies, the service provider gives the Operators user group access to the Operations compartment and the Auditors user group access to the Secured Content compartment. Each user group can view only the logs in the compartments that it has access to. (Sample policy statements follow these examples.)
- Example 2: A security software company has four compartments, WebAccess, DBA, Authentication, and Endpoint, with log groups corresponding to specific business needs. Access control is managed at the compartment level, aligned with organizational roles.
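The following policy statements are a minimal sketch of how the access in Example 1 might be granted. The group names, compartment names, and the loganalytics-log-group resource type shown here are illustrative assumptions; verify the exact resource types and verbs against the Logging Analytics policy reference for your tenancy. Note also that OCI compartment names cannot contain spaces, so Secured Content is written here as Secured-Content.
allow group Operators to read loganalytics-log-group in compartment Operations
allow group Auditors to read loganalytics-log-group in compartment Secured-Content
Because each grant is made at the compartment level, each user group can see only the log groups, and therefore only the logs, in its own compartment.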
Planning Questions: Before creating log groups, consider the following.
- What types of logs will be collected?
- Which teams or users need access to each log group?
- Are there regulatory or business drivers for isolating certain logs?
- Will the organization need to scale or reorganize log storage in the future?
Recommended best practices:
- Engage Stakeholders: Involve operations, security, and compliance teams when planning log group structures.
- Align with IAM Policies: Ensure log groups and compartment structures align with your OCI IAM policies to streamline access control.
- Avoid Over-Grouping: Avoid grouping all logs together, as this can complicate security and maintenance.
- Document and Review: Document your log group strategy and review it regularly to ensure it continues to meet organizational needs.
Tip: Investing time to plan log group organization upfront streamlines log management and security, reducing maintenance overhead in the long term.
- Use Labels for Efficiency:
Labels are additional text that you can add to a log entry. The added text can come from a predefined library of labels that indicate common types of signatures in logs, such as Authentication Failure, User Logged In, or Application Shutdown. These strings are added to the multi-valued Label field. A label can also add an arbitrary value to any field in the log entry. For example, if a log entry contains a numeric status code such as 404, you can add the text string Not Found to a field named Status Message. The Label field, or the output fields populated by a label definition, can then be used in your queries just like any other field.
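For instance, assuming you defined the Status Message output field described above, a simple Log Explorer query against it could be the following (the field name and value are only illustrative):
'Status Message' = 'Not Found'
This returns the log entries whose numeric status code was mapped to that text at ingest time, without any wildcard matching at query time.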
Labels are defined in the log source and are evaluated as the log data is ingested. Label definitions are applied only to new log data ingested after the label definition is added to the log source; previously ingested log data is not enriched with newly added label definitions.
By enriching your logs at ingest-time, you can:
- Increase the speed of your queries because complex evaluations were already done.
- Simplify your queries to make them more readable.
Consider the following example query written in the Log Explorer:
'Log Source' = 'FMW WLS Server Access Logs' and (URI like '%/services/loans/accountcreateservice%' or URI like '%/services/agreements/history%' or URI like '%/services/transferservice%' or URI like '%/services/update/adjustmentservice%')
This query can be very expensive because it is performing a wildcard search for four strings. If performed over a long time range, it can time out. Additionally, if the query is repeated often, say in a dashboard or an alert, it can draw a lot of processing resources.
An alternative is to create four label conditions in the FMW WLS Server Access Logs source, as shown below:
If URI contains '/services/loans/accountcreateservice' set service=loanaccountcreate
If URI contains '/services/agreements/history' set service=agreementhistory
If URI contains '/services/transferservice' set service=transferservice
If URI contains '/services/update/adjustmentservice' set service=adjustmentservice
Then the query can be rewritten as:
'Log Source' = 'FMW WLS Server Access Logs' and Service in (loanaccountcreate, agreementhistory, transferservice, adjustmentservice)
This version is much easier to read and understand, and it is also a much better performing query.
You could also create a single label definition that adds a label such as Watched-URI if the URI contains any of the four patterns above. Then your query could be:
'Log Source' = 'FMW WLS Server Access Logs' and labels = Watched-URI
You can also create detection rules to detect incoming logs with specific conditions at ingest time. See Create a Label, Use Labels in Sources, Detect Predefined Events at Ingest Time, Oracle-defined Detection Labels, and Filter Logs by Labels.
Best Practices for Logging
The following are best practices for configuring your applications and systems, or authoring your software, to emit logs:
- Log Your Timestamps in UTC: This helps avoid gaps caused by Daylight Saving Time (DST) switches. The timestamps can be converted to your local time when you view the logs in the Log Explorer.
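For example, a log entry that records its timestamp in ISO 8601 format with the Z (UTC) suffix is unambiguous across DST changes. The line below is purely illustrative and does not come from any particular product:
2024-03-10T02:30:15.123Z INFO [order-service] Payment request accepted for order 18234
In a time zone that springs forward on that date, a local timestamp such as 2024-03-10 02:30:15 may not exist at all, while the UTC form always identifies a single moment in time.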