Hi all, we are hiring a remote worker and will be supplying a laptop to them. The laptop will be running a Debian-based Linux distribution.

We are a small shop and this is the first time we have entrusted sensitive data to somebody outside of our small pool of trusted employees.

We have sensitive client data on the laptop that they need to access for their day-to-day work.

However, if something goes wrong and they do the wrong thing, we want to be able to send out some kind of command that will completely lock, block, or wipe the sensitive data.

We don’t want any form of spying or tracking. We are not interested in seeing how they use the computer, or any of the logs. We just want to be able to delete that data, or block access, if they don’t return the laptop when they leave, or if they steal the laptop, or if they do the wrong thing.

What systems are in place in the world of Linux that could do this?

Any advice or suggestions are greatly appreciated. Thank you.

  • AndrasKrigare@beehaw.org

    Fundamentally, once someone has some of the data, they have that data, and you can make no guarantee of removing it. The main questions you need to ask are whether you’re okay with limiting the exposure to the data they’ve already seen, and what level of technical expertise they’d need to keep that data.

    Making some assumptions about what’s acceptable as a possibility and how much you want to invest, I’d recommend having the data on a network-mapped share and enforcing a daily quota on their access to it. Any data they accessed (presumably as part of their normal duties) is theirs, and is “gone.” But if you remove their access, they can’t get any new data they didn’t touch before, and if they were to try and hoover up all the data at some point to copy it off, they’d hit their quota and lose access for a bit (and potentially send you an alert as well). This wouldn’t prevent them from slowly sucking out the data day after day.
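    To make the quota idea concrete, here’s a minimal sketch in Python of the kind of watchdog you could run on the file server. The audit log path and line format are made up for illustration (timestamp, username, bytes read per line), and the 2 GiB/day limit is just an example; disabling the over-quota user’s Samba account uses `smbpasswd -d` (re-enable later with `-e`). Treat it as a starting point, not a drop-in tool:

    ```python
    #!/usr/bin/env python3
    # Minimal sketch of a daily read-quota watchdog for a network share.
    # Assumes a hypothetical audit log where each line looks like:
    #   2024-05-01T09:12:03 alice 524288
    # (timestamp, username, bytes read). The log path, line format, and the
    # 2 GiB/day limit are illustrative assumptions, not real defaults.
    import subprocess
    from collections import defaultdict
    from datetime import date

    AUDIT_LOG = "/var/log/share-audit.log"      # hypothetical log location
    DAILY_QUOTA_BYTES = 2 * 1024**3             # example: 2 GiB per user per day

    def bytes_read_today(log_path: str) -> dict[str, int]:
        """Sum bytes read per user for today's date from the audit log."""
        totals: dict[str, int] = defaultdict(int)
        today = date.today().isoformat()
        with open(log_path) as log:
            for line in log:
                try:
                    timestamp, user, nbytes = line.split()
                except ValueError:
                    continue                     # skip malformed lines
                if timestamp.startswith(today):
                    totals[user] += int(nbytes)
        return totals

    def enforce_quota() -> None:
        """Disable share access (Samba account) for any user over quota."""
        for user, total in bytes_read_today(AUDIT_LOG).items():
            if total > DAILY_QUOTA_BYTES:
                # smbpasswd -d disables the Samba account; re-enable with -e
                subprocess.run(["smbpasswd", "-d", user], check=True)
                print(f"quota exceeded: disabled {user} ({total} bytes today)")

    if __name__ == "__main__":
        enforce_quota()
    ```

    In practice you’d wire this to whatever auditing your share already produces (e.g. Samba’s vfs_full_audit module) and hook in the alerting mentioned above, rather than inventing a log format.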

    If they only need to touch a small fraction of the customer data, and particularly if the sensitivity of the data goes down over time (data from a year ago is less sensitive than data from a day ago), this might be a decent solution. If they need to touch a large portion of the data, this isn’t as useful.

    Edit: another nice bit is that you could log, on the network share (at your location), which customer data they’re accessing and when. If you ever audit that and see them accessing things they don’t need, you can take action.
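    A rough sketch of what that audit could look like, again in Python, assuming a hypothetical per-file access log (timestamp, username, path per line) and an illustrative /srv/clients/<client>/… layout on the share:

    ```python
    #!/usr/bin/env python3
    # Sketch of the audit idea: summarise which client folders each user touched.
    # Assumes a hypothetical per-file access log with lines like:
    #   2024-05-01T09:12:03 alice /srv/clients/acme/contract.pdf
    # The log path and the /srv/clients layout are illustrative assumptions.
    from collections import Counter, defaultdict

    ACCESS_LOG = "/var/log/share-access.log"    # hypothetical log location

    def access_summary(log_path: str) -> dict[str, Counter]:
        """Count accesses per user, grouped by top-level client directory."""
        summary: dict[str, Counter] = defaultdict(Counter)
        with open(log_path) as log:
            for line in log:
                try:
                    _timestamp, user, path = line.split(maxsplit=2)
                except ValueError:
                    continue                     # skip malformed lines
                parts = path.strip().strip("/").split("/")
                client = parts[2] if len(parts) > 2 else path.strip()
                summary[user][client] += 1
        return summary

    if __name__ == "__main__":
        for user, clients in access_summary(ACCESS_LOG).items():
            print(user)
            for client, count in clients.most_common():
                print(f"  {client}: {count} accesses")
    ```

    That gives you a per-user summary of which client folders they touched, based purely on server-side logs, without watching anything on the laptop itself.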

    I think the next best solution is the VDI one, where you run a computer at your location and they have to remote into it. If they screen capture, they’ll still save off whatever data they access, and if they have a poor or inconsistent connection to your network it’ll affect their ability to do their job (and depending how far away they are, it might just be super annoying dealing with the lag). On top of that, it depends on how locked down they need to be to do their job. If they need general Internet access, they could always attempt to upload the data somewhere else and pull it from there. If your corporate network has monitoring to catch that, you might be okay, but otherwise I think it’s a lot of downside with a fairly easy way to circumvent it.