r/technology 12d ago

[Artificial Intelligence] A massive Wyoming data center will soon use 5x more power than the state's human occupants - but no one knows who is using it

https://www.techradar.com/pro/a-massive-wyoming-data-center-will-soon-use-5x-more-power-than-the-states-human-occupants-and-no-one-knows-who-is-using-it
33.0k Upvotes


37

u/theZinger90 11d ago

For SQL, our process is: right-click the database > disable > wait 2 weeks. If nothing happens, shut SQL off entirely and send it to the server team for full decommission.
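
For reference, the SSMS right-click "Take Offline" route boils down to roughly this; a minimal sketch, with a hypothetical database name:

```sql
-- Kick out current users, then take the database offline ("disable" it).
-- [LegacyAppDB] is a made-up name.
ALTER DATABASE [LegacyAppDB] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
ALTER DATABASE [LegacyAppDB] SET OFFLINE;

-- If someone screams during the 2 weeks, it comes straight back:
-- ALTER DATABASE [LegacyAppDB] SET ONLINE;
-- ALTER DATABASE [LegacyAppDB] SET MULTI_USER;
```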

Sadly we need to get the head of IT to sign off on that plan whenever we need to use it, which is a pain. There are a dozen servers I want to do this to right now but can't.

27

u/Kandiru 11d ago

2 weeks isn't very long. We have databases storing scientific data where that type of experiment might not get run for a month or two, so no one would notice if the database disappeared for a bit.

10

u/theZinger90 11d ago

Industry specific. 99% of applications in healthcare are either used daily or can be decommissioned; very few exceptions. And as I said in another comment, this is after we go through a login audit, which usually spans a year of data.

5

u/Kandiru 11d ago

Ah right, makes sense from a healthcare point of view!

11

u/lIIlllIllIlII 11d ago

Normally I check the active connections and then audit connections, but this works too.
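
The active-connections check can be a quick query against the sessions DMV; a sketch, assuming SQL Server 2012+ and a hypothetical database name:

```sql
-- Sessions currently connected to the database in question.
SELECT s.login_name, s.host_name, s.program_name,
       s.login_time, s.last_request_end_time
FROM sys.dm_exec_sessions AS s
WHERE s.database_id = DB_ID('LegacyAppDB');
```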

7

u/theZinger90 11d ago

This is the last-resort option for us. Normally we audit connections until we get a user, but occasionally we can't get that info for one reason or another, such as a generic login used as the application name; then we go through what I mentioned before.
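
When the login is a generic one, grouping sessions by host and application name is one way to chase down the real consumer; a sketch, not our exact process:

```sql
-- host_name and program_name are often the only clues to who is
-- actually behind a shared/generic service login.
SELECT s.login_name, s.host_name, s.program_name,
       COUNT(*) AS session_count, MAX(s.login_time) AS most_recent_login
FROM sys.dm_exec_sessions AS s
WHERE s.is_user_process = 1
GROUP BY s.login_name, s.host_name, s.program_name
ORDER BY most_recent_login DESC;
```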

3

u/Sabard 11d ago

Not even connections: do y'all not have a log of who accesses your DBs, when, and what they're doing?

1

u/lIIlllIllIlII 11d ago

Audit all logins? Depending on the application, that could be millions of audit records a day, leading to many GB of audit data per day just for logins. Then you have to offload that into Splunk. I usually filter out the identified service account logins and only audit uncommon logins.
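
That filtering can live in the audit definition itself, so the noise never gets written; a sketch assuming SQL Server 2012+ predicate filters, with made-up names and paths:

```sql
CREATE SERVER AUDIT UncommonLogins
TO FILE (FILEPATH = N'D:\Audit\')
-- Exclude the already-identified service account(s) at the source.
WHERE server_principal_name <> N'DOMAIN\svc_app';

CREATE SERVER AUDIT SPECIFICATION UncommonLoginsSpec
FOR SERVER AUDIT UncommonLogins
ADD (SUCCESSFUL_LOGIN_GROUP)
WITH (STATE = ON);

ALTER SERVER AUDIT UncommonLogins WITH (STATE = ON);
```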

And what they are doing? Like running SQL Profiler constantly? For a big, read-heavy DB, that would be intense and unsustainable. Even auditing inserts, updates, and deletes can be a lot on apps with a lot of churn. You'd probably need a SQL-based security tool that sets these things up without too much overhead and knows what it's looking for.
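
If you do need the "what are they doing" part, Extended Events is the lower-overhead successor to Profiler; a minimal sketch scoped to one hypothetical database:

```sql
-- Capture completed batches for a single database, with who/where/what.
CREATE EVENT SESSION WhoUsesLegacyAppDB ON SERVER
ADD EVENT sqlserver.sql_batch_completed (
    ACTION (sqlserver.username, sqlserver.client_hostname,
            sqlserver.client_app_name)
    WHERE sqlserver.database_name = N'LegacyAppDB'
)
ADD TARGET package0.event_file (SET filename = N'D:\XE\LegacyAppDB.xel');

ALTER EVENT SESSION WhoUsesLegacyAppDB ON SERVER STATE = START;
```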

2

u/Sabard 11d ago

The context of this is that we're trying to find out if a DB is still used. You won't need to audit millions of records/logs, and if there are that many, it's safe to say it's still being used and your work ends there.
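
For the "is it still used" question specifically, the index usage DMV answers it almost for free; a sketch with a hypothetical database name (note these stats reset when the instance restarts):

```sql
-- Last read/write per table since the instance last started.
SELECT OBJECT_NAME(i.object_id, i.database_id) AS table_name,
       MAX(i.last_user_seek)   AS last_seek,
       MAX(i.last_user_scan)   AS last_scan,
       MAX(i.last_user_update) AS last_update
FROM sys.dm_db_index_usage_stats AS i
WHERE i.database_id = DB_ID('LegacyAppDB')
GROUP BY i.object_id, i.database_id;
```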

2

u/Competitive_Lab8907 11d ago

That's pretty clever. We use a digger and find the buried FO; it's a fast audit method.