lookup adds data to each existing event in your result set, based on a field in the event matching a value in the lookup.
inputlookup takes the table of the lookup and creates new events in your result set (either as a brand-new result set, or appended to a prior one).
outputlookup takes the current event set and writes it to a CSV lookup file or a KV Store collection.
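To make the difference concrete, here is a minimal sketch of each as three separate searches; assets.csv, ip_counts.csv, and the field names are invented placeholders. The first enriches events that already have an ip field with an owner column from the lookup, the second turns the lookup's rows into events, and the third writes aggregated results back out to a lookup file.

```spl
index=main sourcetype=access | lookup assets.csv ip OUTPUT owner

| inputlookup assets.csv

index=main | stats count by ip | outputlookup ip_counts.csv
```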
As an aside, when getting started with SPL commands, the Quick Reference Guide is the holy grail IMO for learning Splunk's key concepts and common commands, along with examples of each. Keep it in your back pocket, along with the Search Reference docs.
Yes, you can look up two tables in the same search; you can even join the two tables together. It really depends on what you're trying to do with the lookup (whether you need multiple inputlookup calls, or multiple lookup calls).
The former requires the use of append or join:
| inputlookup lookup1
| append [| inputlookup lookup2]
| join ip [| inputlookup lookup3]
The latter is just sequential:
index=<index> sourcetype=<sourcetype>
| lookup lookup1 ip
| lookup lookup2 host
OR
| inputlookup lookup3
| lookup lookup1 ip
| lookup lookup2 host
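One thing to watch when stacking lookups like this: OUTPUT overwrites any existing value of the output field, while OUTPUTNEW only writes it where the field is still empty. A hedged sketch (the lookup names and the owner field are placeholders):

```spl
| lookup lookup1 ip OUTPUT owner
| lookup lookup2 host OUTPUTNEW owner
```

Here the second lookup only fills in owner for events the first lookup left empty.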
To add multiple lookup files to a search (this works for both a Cluster Map and a Choropleth Map), you can just stack the lookup commands.
IP manipulation: removing the port
rex command:
Extracts the IP address portion from the field_with_ip_port field.
The regex (?<ip>\d{1,3}(\.\d{1,3}){3}) captures an IPv4-style dotted quad (it matches the shape, but does not validate that each octet is in 0-255).
stats count by ip:
Groups the results by the extracted ip field.
Counts the occurrences of each unique IP address.
Count by the location of an IP
Extract values from a raw entry
With a default data set, Splunk grabs the data a little differently: for what's inside the details object, use the spath command with dot notation to extract fields from the raw JSON.
| lookup c2cisp.csv ip as d_ip OUTPUT ip as c2cisp | search c2cisp=*
| lookup IP.csv ip AS ipAdd OUTPUT ip AS match_ip
| where isnull(match_ip) | where ipAdd != "123.123.123.123"
| stats count by userDisplayName, ipAdd | sort - count
your search....| sort -count
your search....| sort -_time
| where ClientIP IN ("86.48.9.97", "92.119.17.191")
| iplocation ipAdd
| geostats latfield=lat longfield=lon count by userName
| iplocation ip
| stats count by Country
| rename Country AS country count as numb
| sort -numb
| geom geo_countries featureIdField=country
source="activity" load=Directory Op=Logged
| lookup Microsoft.csv subnet AS CIP OUTPUT subnet AS matched_subnet1
| lookup IP.csv IP AS CIP OUTPUT IP AS matched_subnet2
| where isnull(matched_subnet1) AND isnull(matched_subnet2)
| iplocation CIP
| geostats latfield=lat longfield=lon count by UserId
index=your_index sourcetype=your_sourcetype
| rex field=source_IP "(?<ip>\d{1,3}(\.\d{1,3}){3})"
| stats count by ip
sourcetype="activity"
| spath Operation
| search Operation=FileAccessed ClientIP!=123.123.123.123
| lookup MiUm.csv subnet AS ClientIP OUTPUT subnet AS matched_subnet1
| lookup GEP.csv GEIP AS ClientIP OUTPUT GEIP AS matched_subnet2
| where isnull(matched_subnet1) AND isnull(matched_subnet2)
| iplocation ClientIP
| stats count by Country
| sort count
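A caveat on the subnet matching in the examples above: a lookup called by bare CSV filename does exact string matching, so an exact ClientIP will never equal a subnet column holding CIDR ranges. CIDR matching requires a lookup definition with match_type set in transforms.conf, roughly like the sketch below (the stanza name mium_subnets is an assumption), after which you'd call | lookup mium_subnets subnet AS ClientIP OUTPUT subnet AS matched_subnet1 instead of the filename:

```conf
# transforms.conf (sketch) — lets the subnet column match by CIDR range
[mium_subnets]
filename = MiUm.csv
match_type = CIDR(subnet)
```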
category: status
details: {
new: [
{
name: status
value: offline
}
]
old: [
{
name: status
value: online
}
]
}
device: {}
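Against a raw event shaped like the one above, here is a spath sketch that pulls the old and new status values out of the details arrays (the output field names new_status and old_status are placeholders):

```spl
sourcetype="activity" category=status
| spath path=details.new{}.value output=new_status
| spath path=details.old{}.value output=old_status
| table _time category old_status new_status
```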
index=* "UserAgent"
| rex field=_raw "Name:\s*UserAgent\s*\n\s*Value:\s*(?<UserAgent>.+)"
| rex field=_raw "ClientIP:\s*(?<ClientIP>\S+)"
| lookup suspicious_user_agents.csv http_user_agent AS UserAgent OUTPUT http_user_agent AS Match
| eval Match=if(isnotnull(Match), "Suspicious", "Normal")
| lookup ip.csv IP AS ClientIP OUTPUT IP AS KnownIP
| where isnull(KnownIP)
| stats count by UserAgent
| sort - count
* Operation=FileDownloaded
| stats count as download_count by UserId
| where download_count > 200