Using Splunk with 1Password Business

1Password Business makes it easy to monitor events that happen on your team using the Activity Log, and you can take that to the next level by adding Splunk to the mix. Using the 1Password command-line tool, you can send your team’s 1Password activity to Splunk and keep track of it there alongside other happenings within your team.

One of Splunk’s most popular features is the ability to find events and trigger alerts based on them. For example, in your team you could set things up so the sysadmins are alerted whenever someone is added to the Owners group in 1Password. I’ll get into that example a bit more later in this post.

Set up the 1Password command-line tool

To kick things off, let’s set up the 1Password command-line tool, if you’re not using it already:

1Password command-line tool: Getting started

When setting up the tool, start by creating a custom group and giving it the View Admin Console permission so it can view the Activity Log, then add a user to that group. Once the tool is set up with that user’s account, get a session token:

$ op signin example

This will allow you to interactively enter the Master Password with secure input. If you’re putting this in a script, though, you’ll want to pass the Master Password through stdin to the op signin call to get your session token.

To automate that, get the Master Password from a secure storage location and pipe it to the sign-in command. A HashiCorp Vault is a good place to securely store the account’s Master Password. I’m using GPG in this example, but you can use KMS or something else that you’re comfortable with – just avoid echo. 😉

To make things simpler, you can omit the email address and Secret Key from op signin since they are saved in ~/.op/config. That lets you reduce the whole sign-in step to one line by piping the Master Password to it:

gpg -q --decrypt password.enc | op signin example

Start fetchin’ those audit events

Now that we have our session token, we can start getting some audit events. Create a script that’s run by a job scheduler such as cron at regular intervals (every 10 minutes should suffice). That script needs to:

  1. Create the session like we just did above.
  2. Read the last processed event ID from disk.
  3. Fetch events newer than that ID.
  4. Send the events to Splunk.
  5. Save the latest event ID to disk.
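If you go with cron, the schedule could look something like the following crontab entry. The script path here is a hypothetical placeholder for wherever you keep your script:

```
*/10 * * * * /usr/local/bin/op-events-to-splunk.sh
```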

To do this, we’ll be working with JSON, so jq is a good companion if you’re working in bash; you could also use a scripting language with built-in JSON support, such as Python or Ruby.

You can fetch up to 100 events newer than $ID. To fetch them:

op list events $ID newer

To make sure you get all the events, you’ll need to run that until nothing is returned, since only 100 events are returned each time. This command will return a JSON array of event objects like this:

{
  "eid": 392879,
  "time": "2018-01-23T15:50:49Z",
  "action": "join",
  "objectType": "gm",
  "objectUuid": "hd22y2bob6qdpap2ge6d7nn4yy",
  "auxInfo": "A",
  …
}

You can send all of the events in the array to Splunk at this point by using something like the Splunk universal forwarder.
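One low-friction way to hand the events to the universal forwarder is to append them, one JSON object per line, to a file the forwarder monitors. This is a sketch of that idea; the spool path you use is up to you:

```python
import json


def append_events(events, spool):
    # Write one JSON object per line; the universal forwarder can be
    # configured to monitor this file and index each new line.
    with open(spool, "a") as f:
        for event in events:
            f.write(json.dumps(event) + "\n")
```

You would then point a [monitor://] stanza in the forwarder’s inputs.conf at that file with a JSON-aware sourcetype so Splunk parses each line as an event.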

Next, take the eid of the first object in that array and save it to disk so it can be used for the next fetch. If the array from op list events is empty, it means there are no newer events, and you’re done here — for now.
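If you take the Python route, the fetch-and-save cycle (steps 2 through 5, minus the Splunk hand-off) can be sketched like this. The state-file location and the exact way op is invoked are assumptions for illustration:

```python
import json
import subprocess
from pathlib import Path

STATE = Path("last_event_id")  # hypothetical location for the saved cursor


def read_last_id(default="0"):
    # Step 2: read the last processed event ID from disk.
    return STATE.read_text().strip() if STATE.exists() else default


def fetch_batch(last_id):
    # Step 3: "op list events <id> newer" returns up to 100 events as JSON.
    out = subprocess.check_output(["op", "list", "events", str(last_id), "newer"])
    return json.loads(out)


def collect_new_events(last_id, fetch=fetch_batch):
    # Keep fetching until an empty array comes back, since each call
    # returns at most 100 events. The newest event is first in each batch,
    # so its eid becomes the cursor for the next call.
    events = []
    while True:
        batch = fetch(last_id)
        if not batch:
            return events
        events.extend(batch)
        last_id = batch[0]["eid"]


def save_last_id(eid):
    # Step 5: save the latest event ID to disk for the next run.
    STATE.write_text(str(eid))
```

The fetch function is passed in as a parameter so you can swap in a stub while testing without calling the real op tool.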

Get alerts about important actions in your team

Earlier I mentioned that one handy use for Splunk with 1Password Business is seeing when someone is added to the Owners group. To do this, you would look for an event in the Activity Log that has:

  • action: join
  • objectType: gm (Group Membership)
  • objectUuid: your Owners group’s UUID, which you can get by signing in to your account on the web, opening the Owners group, and copying the UUID from the end of the address bar in your browser.
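With those fields indexed, the Splunk search behind such an alert could look something like the following. The index name and the UUID are placeholders you’d replace with your own values:

```
index=1password action=join objectType=gm objectUuid=<your-owners-group-uuid>
```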

Every audit event comes with an actorUuid field. It’s a great identifier, but when perusing the logs we have no idea who YJTZ3RWWFRBNTF4M2YEEY3EPOQ is. To fix that, let’s upgrade our script a bit. Before we fetch events, we’ll get a user list with op list users, which returns every user on the account along with basic information like their name and email address. With that, we can process each event object, look up the actor by UUID, and attach more descriptive information before sending things to Splunk.

In this example case of sending an alert when someone is added to the Owners group, it’s probably nice to know who was added. The auxUUID field of the audit event will be the UUID of the user who was added to the group. You can do the same lookup that we did above for the actor. For many events, auxUUID will not be a user UUID, so make sure to fail gracefully there.
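Putting the two lookups together, the enrichment step might look like the sketch below. The exact field names in the user objects (uuid, name) are assumptions based on the kind of basic information op list users returns:

```python
def build_user_index(users):
    # users: the parsed JSON array from `op list users`.
    return {u["uuid"]: u for u in users}


def enrich_event(event, index):
    # Attach a readable name for the actor and, when the auxUUID happens
    # to be a user UUID, for the affected user too. Fall back to the raw
    # UUID so events without a matching user still pass through cleanly.
    enriched = dict(event)
    actor = index.get(event.get("actorUuid"))
    enriched["actorName"] = actor["name"] if actor else event.get("actorUuid")
    aux = index.get(event.get("auxUUID"))
    if aux:
        enriched["auxName"] = aux["name"]
    return enriched
```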

Now that we’ve set things up, whenever Splunk finds an event matching this, it’ll be able to alert your sysadmins via Slack or another method and let them know that Lorraine added Bobby to the Owners group. From there, they can take action if they need to.

Try it out and tell us what you think

When it comes down to it, sending your team’s 1Password activity to Splunk gives you one place to audit any administrative action your team has been taking in 1Password, alongside all the other tools your company uses. There are a lot of things you can look out for, from the Owners group example I mentioned before to knowing when someone adds or removes a team member from a vault or changes their permissions.

We’d love to hear how you set things up, so feel free to comment below, send us a message, or start a discussion in our forum with suggestions, questions, and anything else you’d like to chat about!

4 replies
  1. Peter Sagerson says:

    This definitely sounds like something I’d like to play with, with one exception: it’s hard to imagine any circumstances under which I would expose my master password to automation. Especially as an account owner. Hopefully you’re already hard at work on some notion of non-user accounts (equivalent to GCP service accounts) for such purposes. Alternately, some scheme to derive long-lived reduced-access authentication contexts (credentials or sessions) from one’s account would also do the trick.

    • Rick Fillion says:

      Hi Peter,

      You’re right, this is a pretty major concern. We quite like how we’ve solved this problem with the 1Password SCIM bridge, which is based on the 1Password command-line tool. The SCIM bridge can be thought of as a web service that automates the administration tasks associated with 1Password, so it needs a long-term secret that allows it to sign in as the user. In the case of the SCIM bridge, you sign in once manually and it provides you with two secrets: one that you give to your directory service (Okta, Azure Active Directory) to use as an authentication token, and another that is given to the SCIM bridge itself. Without both secrets, the SCIM bridge can’t do anything. Neither secret actually contains the user’s Master Password. Instead, when combined, the secrets provide the SCIM bridge with a key that was derived from the Master Password, as well as the SRP x value, which allows it to communicate with the server. These are technically just as sensitive as a Master Password (since both are derived from it), but everyone feels better about the fact that the Master Password itself isn’t involved at all.

      I’m hopeful that we can take some of the lessons learned from the 1Password SCIM bridge and bring those enhancements to the 1Password command-line tool. It would open up the door to a lot of features too, so it’s not just a security improvement. Until that’s the case, the only way to authenticate with the command line tool is to provide it your Master Password.

      We recommend that you always use a user account that has the least power/access possible whenever dealing with automation, and that includes the 1Password SCIM bridge. Reduced-access authentication contexts are a possible solution that we think would still have value even given how we handle things with SCIM. There’s a lot that I’d love for us to do regarding security policies. Encryption is always king, but that doesn’t mean policy doesn’t have its place.


  2. Joseph P. Hillenburg says:

    Setting aside Peter’s arguments against automation (i.e. let’s assume we’re just interested in event auditing), I regard this as an interim solution. 1Password for Teams really needs the ability to be directly integrated with Splunk. It would also be nice to see a series of pre-defined reports and dashboards as a Splunk app.

    • Rick Fillion says:

      Hi Joseph,

      I think you’re right that this is an interim solution. Everything is an interim solution in tech… we should never be satisfied with what’s currently possible and should always strive to do better.

      There are some interesting challenges to doing a direct integration with Splunk. For a direct integration to work, the server would need to send the data to Splunk. This could probably be done, but its utility would be limited. The server doesn’t have some of the data that you might want. For example it doesn’t have the vault names because those are encrypted client-side and there’s no way for us to decrypt them to include them in what’s sent to Splunk. While we could maybe send all of the raw data (i.e. what you get from op list events), making something that’s more useful to humans requires additional work.

      Last week I was looking at Beats and that’s super interesting. I see a ton of potential in a tool like that combined with our command line tool.

      The challenge for us is always in how to do all of the fun things you might want to do in a way that’s secure and doesn’t put your data at risk. That challenge is why I get out of bed every morning. :)

