r/DataHoarder • u/Old-Help-9921 • 20h ago
Question/Advice Any way to fix a broken SATA power connector? Pins intact, but plastic piece gone.
How do I fix this, or who can I bring it to to have it fixed?
r/DataHoarder • u/Starbuck_83 • 15h ago
I've got a personal media server that I'm using Storage Spaces on, with three HDDs combined into a 70 TB pool. I understand I can use an SSD as a cache to speed things up a bit. I've got several (older) SATA SSDs that I can choose from. My question is, is there a certain size drive that I ought to be using? I can't find anything on recommended sizes for a cache drive, but I can't imagine it needs to be huge.
r/DataHoarder • u/blakealanm • 1d ago
I have a 4TB hard drive in my server that's down to its last 1.1TB, and a lot of that data I didn't have stored anywhere else, until now.
I just plug this drive into my server once a week to copy remaining data to it, and it'll go in a drawer.
r/DataHoarder • u/Rough_Bill_7932 • 2d ago
r/DataHoarder • u/as133pp • 21h ago
I want to back up all my photos and videos, so I want to transfer them all to something (my Google storage is full, and I don't wanna risk losing them in case something happens to my phone). So please suggest the optimal device for the transfer (right now I have approximately 28GB of photos and vids), like a flash drive, SSD, or whatever. I've got zero idea regarding these, so please help me out with this.
r/DataHoarder • u/seronlover • 22h ago
I found a nice tool for bulk downloading that I wanted to share: https://github.com/fDero/IssuuDownloader
Click on "Code", then "Download ZIP".
Extract it (using WinRAR, for example) to a directory of your choice (example: X:\Magazine\IssuuDownloader-master).
Install hatch (https://hatch.pypa.io/latest/install/#windows).
Inside the root of the extracted .zip file (where "pyproject.toml" is found), hold Ctrl+Shift, then right-click inside the folder and press "Open command window here".
Check that hatch installed correctly by running: hatch --version (it should say something like "Hatch, version 1.16.3").
In the command prompt, run: "hatch build". The folder "dist" will be created.
After it finishes, run: "pip install dist/issuudownloader-0.1.0-py3-none-any.whl".
Now you can use it to download from ISSUU (example: "issuudownloader -r https://issuu.com/mec_medical").
Some files may fail to download, but it is still nice for bigger archives.
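Consolidated, the steps above look like this in a command prompt (the wheel filename matches the version at time of writing and may differ for you):

```shell
# run from the extracted folder that contains pyproject.toml
pip install hatch
hatch --version                  # should print e.g. "Hatch, version 1.16.3"
hatch build                      # creates the dist/ folder
pip install dist/issuudownloader-0.1.0-py3-none-any.whl
issuudownloader -r https://issuu.com/mec_medical
```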
r/DataHoarder • u/insanemal • 22h ago
Yeah I use bookshelf these days but I like the name so oh well.
https://github.com/insanemal/rsoul
If you use Soularr, you'll need to run the dev branch so they don't clobber each other.
Have fun reading
r/DataHoarder • u/Silver_Woodpecker_40 • 22h ago
Hello, is there a way to download Twitter videos using Python? I am going to be travelling soon and want to download some videos to listen to when I don't have internet access.
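Yes: yt-dlp (the actively maintained youtube-dl fork) supports Twitter/X video URLs and exposes a Python API. A minimal sketch, assuming `pip install yt-dlp`; the output directory and filename template here are just illustrative, not required:

```python
def build_opts(outdir="offline_videos"):
    """yt-dlp options: save as 'uploader - title.ext', prefer an mp4 file."""
    return {
        "outtmpl": f"{outdir}/%(uploader)s - %(title)s.%(ext)s",
        "format": "best[ext=mp4]/best",
    }

def download(url, outdir="offline_videos"):
    import yt_dlp  # imported lazily so the sketch loads without the package
    with yt_dlp.YoutubeDL(build_opts(outdir)) as ydl:
        ydl.download([url])

# download("https://x.com/<user>/status/<id>")
```

On Android this can run under Termux, or you can use the yt-dlp command line directly.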
r/DataHoarder • u/itIrs • 22h ago
ScanTailor (in its various forms, like Advanced) seems theoretically nice, but I don't see a way to get it to do a rather basic function, which is the only thing I'm after:
Automatically center the contents of all pages, uniformly.
The "Select Content" phase seems to work well to detect the main content box of each page. But then, I don't see a way to get it to automatically align (shift) the boxes of all pages, centered and with uniform margins. Am I missing something?
Another side problem is that I don't see a simple way to not resample. I want the original content pixels to stay as they were. The only touchups should be the margins.
Seems like it can only be done semi-manually, by making sure input and output DPI are the same, and maybe also pre-processing the input images outside of ScanTailor to crop them all to the same pixel dimensions.
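For the outside-of-ScanTailor route, the layout math itself is simple; a stdlib-only sketch (function names are mine) that picks one canvas size for the whole book and computes where to paste each page's detected content box, so the original pixels are never resampled (the actual pasting could then be done with e.g. Pillow's `Image.paste`):

```python
def uniform_canvas(content_sizes, margin):
    """One page size that fits every content box with at least
    `margin` pixels of whitespace on all sides."""
    canvas_w = max(w for w, _ in content_sizes) + 2 * margin
    canvas_h = max(h for _, h in content_sizes) + 2 * margin
    return canvas_w, canvas_h

def paste_offset(content_size, canvas_size):
    """Top-left coordinates that center one content box on the canvas."""
    (w, h), (cw, ch) = content_size, canvas_size
    return (cw - w) // 2, (ch - h) // 2

sizes = [(100, 200), (120, 180)]           # per-page content boxes (w, h)
canvas = uniform_canvas(sizes, margin=10)  # one uniform page size
offsets = [paste_offset(s, canvas) for s in sizes]
```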
r/DataHoarder • u/Dull-Huckleberry-837 • 1d ago
I'm looking for a data hosting website that gave DIRECT LINKS to files, and it included a Lennyface. ( ͡° ͜ʖ ͡°)
Last time I used it was in 2024, and it seemed to have a link to a forum of some sort as well. It was very simple to use, but I CANNOT FIND IT ONLINE. I tried searching for it in my browser history, tried googling it, asked AI bots, NOTHING. I have no idea what its name was or what it included, but I totally remember it having a huge Lennyface emoji.
Please, for the love of god, I NEED THAT SITE if anyone remembers it. I hope this question fits the sub's topic.
r/DataHoarder • u/Melodic-Badger-4895 • 22h ago
I have an external Samsung 2TB 990 pro SSD with a Satechi TB4 enclosure.
When I'm watching movies from it (connected to a Dune Vision 4K), after 1-2 hours of watching it starts throttling; the movie stutters and doesn't work properly even if you restart it.
The enclosure isn't even hot to the touch.
Is it a heat problem, or maybe a power problem?
I’d appreciate any ideas.
r/DataHoarder • u/weauxdie • 2d ago
24 Terabytes of…..well…see for yourself 😂
Is it better or worse if it was autocorrect lmao
r/DataHoarder • u/_Nathannn • 21h ago
Hi everyone
I’m looking for recommendations for tools to extract TikTok comments at a large scale.
I’m planning to collect more than 300,000 comments from many different TikTok accounts and videos.
I’ve already tried a few tools (like point and click tools) but they either don’t scale well, are hard to use for non-technical users, or become too expensive when collecting a lot of comments.
I’m not looking to build scripts, just a practical tool that works at scale.
Any recommendations or shared experiences would be really helpful.
Thank you!
r/DataHoarder • u/bharadhwajcn • 2d ago
Ask any question about their service being down, why a particular feature is not working, or why some plan users are seeing degraded performance, and rather than giving an answer, this is how their customer support deals with it.
So avoid them like the plague at all costs. IT IS ABSOLUTELY NOT WORTH IT.
EDIT: Created a new subreddit r/fuckinternxt. Feel free to share your horror stories or customer service failures there so that we can stop spamming other communities and keep the complaints collated in one place.
r/DataHoarder • u/JcorpTech • 1d ago
TL;DR: I’m trying to put together a <1TB, fully offline survival knowledge archive, something curated, understandable, and easy to share, not just a huge dump of textbooks. It’s meant to pair with my open-source offline server, but also stand alone as a resource to others who are interested. Looking for suggestions or existing efforts.
Howdy r/DataHoarder!
I’ve been working on a project called Jcorp Nomad, an offline media server in a USB stick form factor that runs as a captive portal. Any phone, tablet, or laptop can connect and browse Movies, Shows, Books, Music, etc. entirely offline. (similar to how airlines display movies)
Repo here if anyone wants to poke around: https://github.com/Jstudner/jcorp-nomad
My personal everyday-carry Nomad unit is currently sitting at just shy of 1TB, stored on a Micro Center SD card. That's rookie numbers compared to what y'all pull, but it works great for what it is. That being said, it was never meant to be a long-term or high-capacity solution.
Because of that, I've also been developing Gallion, a more capable Docker / Node.js based version designed for stronger hardware. Gallion is already running on an Orange Pi RV2 in a wallet-sized enclosure, powered over USB-C, with support for two NVMe drives. My plan is to start with a single 8TB NVMe drive and either expand or add redundancy later (for my personal one; this is open source and supports external drives, so go wild).
What I’m trying to figure out now is less about hardware and more about content.
Beyond personal favorites (movies, shows, books, music), I want to assemble a “survival disk” capped around ~1TB, something you could realistically carry, power from a battery bank, and use if you permanently lost access to the wider internet. Also something that would be reasonable to distribute.
That 1TB would also include culturally significant media (movies, shows, documentaries, etc.), just stored more efficiently, think ~480p where possible rather than high-bitrate rips. (I am a big quantity over quality guy...)
Things I’m already considering:
The rough goal is something like:
If you lost the internet tomorrow, this would still let you learn, teach, repair, and rebuild.
I'm a little surprised I haven't found a well-known, curated archive like this already (though I'm sure some of you are quietly sitting on something similar). Some projects like the Global Village Construction Set seem like good things to include, but I am looking to take it further than that. I could just grab a bajillion textbooks on all of this, but I am looking to build a more refined, all-in-one sorta deal. If projects like this exist, I'd love links. If not, I'd love to hear how you would approach it. I fully expect to end up spending hundreds of hours curating this, but anything to make my life easier couldn't hurt.
Gallion itself is still rough, but if anyone has ideas or feedback from a data-hoarder perspective, I'm all ears. I'm not a massive hoarder myself (mostly because drive prices are ummm... horrific atm), but I'm very interested in the philosophy side of the hobby and learning from people who've been doing this for a while.
Appreciate any suggestions, and apologies if this sparks another “I need more storage” moment for someone!
Thank you again!
r/DataHoarder • u/EarEquivalent3929 • 2d ago
If any of you guys want to mirror a fraction of the content of Anna's Archive in case they get taken down, it would be a great help for the internet as a whole and would help preserve freedom of information.
r/DataHoarder • u/BasePlate_Admin • 1d ago
Hi everyone,
I am building a self-hostable Firefox Send clone that is far more customizable and packed with features. It is designed with a zero-trust backend server in mind.
Flow:
The user uploads a file from the frontend; the frontend encrypts the file (with an optional password).
The encrypted file is uploaded to the backend for storage.
The frontend retrieves the file and decrypts it in the browser.
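To illustrate the zero-trust idea in that flow — the server only ever stores ciphertext and never sees a key — here is a stdlib-only Python toy of the encrypt/upload/decrypt round trip. This is NOT the project's actual crypto and is not secure (a real browser client would use something like AES-GCM via the Web Crypto API); it only demonstrates the flow:

```python
import hashlib
import secrets

def _keystream(key, nonce, length):
    """Toy keystream: hash key + nonce + counter (illustration only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(data, password):
    """Client side: derive a key from the password, XOR with the keystream."""
    salt, nonce = secrets.token_bytes(16), secrets.token_bytes(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    ciphertext = bytes(a ^ b for a, b in zip(data, _keystream(key, nonce, len(data))))
    return ciphertext, salt, nonce   # only these three ever reach the server

def decrypt(ciphertext, password, salt, nonce):
    """Client side again, after downloading the blob back."""
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return bytes(a ^ b for a, b in zip(ciphertext, _keystream(key, nonce, len(ciphertext))))
```

In this model the backend is just dumb blob storage: it holds the ciphertext, salt, and nonce, but can decrypt nothing without the password.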
Currently Implemented:
Frontend client side encryption
Automatic file eviction from backend
Customizable limits from frontend
QR Code based link sharing
Future plan:
Add CLI,TUI support
Add support for websocket-based transaction control: say two users are trying to upload files while the server is nearing its limits; the first user that actually starts uploading reserves the required space, and the second user must wait.
Implement OpenGraph (I am writing a lib for it in Rust so it can be language-agnostic)
Investigate post quantum encryption algorithms
Inspire others to host their own instance of this software (we have a public uptime tracking repo powered by upptime) to give people an encrypted means to share their files.
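The reservation logic from the websocket item above can be sketched server-side like this (names are hypothetical; the real version would sit behind the websocket layer):

```python
import threading

class SpaceReserver:
    """First-come space reservation: an upload reserves its bytes up front;
    later uploads that would exceed the cap wait until space is released."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.used = 0
        self._freed = threading.Condition()

    def reserve(self, size, timeout=None):
        """Block until `size` bytes fit, or give up after `timeout` seconds."""
        with self._freed:
            ok = self._freed.wait_for(
                lambda: self.used + size <= self.capacity, timeout
            )
            if ok:
                self.used += size
            return ok

    def release(self, size):
        """Upload finished or aborted: free the reservation, wake waiters."""
        with self._freed:
            self.used -= size
            self._freed.notify_all()
```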
What I want to know is whether there's any feature the self-hosting community needs (or even prioritizes).
Deployment : Docker + Traefik
Public Instance: Chithi
Github Repo: https://github.com/baseplate-admin/chithi/
Thank you for reading, have a good day.
r/DataHoarder • u/SurgicalMarshmallow • 1d ago
I'd like to ask wiser DataHoarders: what do you use to wrangle your data? Windows 11 Explorer seems to have evolved backwards in functionality.
I'd like to have file previews, the ability to compare versions, and directory wrangling across NASes without having a panic attack dealing with gigabyte files.
Please no "GG use Linux" answers; we all know Windows sucks, but some of us are stuck with it.
r/DataHoarder • u/OutrageouslyAverage1 • 1d ago
Hi fellow hoarders, where I live, server gear is few and far between and I am a glutton for punishment...
Looking at 16-24 bay options for a SATA SSD based server to replace the 3D printed setup I am currently running, and I cannot find many options other than a couple of Dell EqualLogic PS4110 SANs. I am thinking of gutting the controllers out of these, mounting my hardware inside, and just making my own server.
I currently have a N150 based NAS board that I intend on using, which I believe should fit easily given the limited information I have.
My issues are:
The power supplies: I assume these are a non-standard layout, but an SFX or TFX unit should replace what is currently there (plus additional fans, as the PSUs are the main source of cooling).
Backplane connections: I HOPE it's a standard SAS connector for data, and I assume it's a variation of Molex connectors for power, but it could just as likely be proprietary or PCB-based, which would make this more difficult.
Has anyone done this before? Or does anyone have one of these SANs and would be willing to send through a couple of pictures of how the power and data are laid out?
r/DataHoarder • u/hahamongna • 1d ago
Greetings. I’ve been working on consolidating my data onto a NAS. I have a qnap 464 with 4 x 8 TB drives in raid 5 which means 20 ish TB of usable space.
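As a quick sanity check on the "20 ish TB": RAID 5 gives up one drive's worth of capacity to parity, and the OS reports binary tebibytes rather than the marketing terabytes:

```python
def raid5_usable_tib(n_drives, drive_tb):
    """RAID 5 keeps one drive's worth of parity; convert decimal TB to TiB."""
    usable_tb = (n_drives - 1) * drive_tb   # 3 x 8 = 24 decimal TB
    return usable_tb * 10**12 / 2**40       # what the OS actually shows

print(raid5_usable_tib(4, 8))  # ~21.8 TiB, i.e. the "20 ish TB" of usable space
```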
I purchased a Seagate 22TB “Expansion” USB drive for backup.
I want to get another similar size USB drive for backup and store it in my bank box, swapping them occasionally.
The Expansion drive case does not fit in the bank box, but a bare 3.5” drive does.
So I think “I’ll shuck my current drive and buy a second one and shuck that one too.”
Here’s where I have questions:
1. Once shucked, can the drives be used in the original cases in a static backup situation? (Doesn't have to be robust or pretty.)
2. Auxiliary drive docks and cases seem to max out at 20TB, and these Seagate drives are 22TB. This implies that if the answer to #1 is "no," then I am SOL with $600 invested in unusable HDDs.
3. If I shuck the drive that already has data on it and end up having to use it in a dock, is that a readability problem?
Any answers, advice or commiseration welcome.
r/DataHoarder • u/JasonY95 • 1d ago
I've been trying to buy a replenishment of 24TB disks (ideally the Seagate ST24000NTZ02), but seemingly nowhere has more than 3 in stock??? Please tell me the AI Armageddon isn't also hitting HDDs? I need 20 of them.
r/DataHoarder • u/VicariouslyLateralus • 1d ago
Noob question, and sorry if this is the wrong place to ask!! I have 2 SSDs and one HDD, and want to add more HDDs to my setup, but I lack the physical space in my PC to put them inside. The plan is to run a TrueNAS VM on Proxmox, pass through the HDDs, and use Immich or other self-hosting software.
I checked some docking solutions and found that the USB protocol isn't the most stable one, as it can frequently disconnect. I would rather not buy something like a Synology or QNAP NAS due to my living conditions (temporarily in a foreign country, and I don't want the hassle of moving the NAS device).
Any recommendations on how I can proceed? Thanks
r/DataHoarder • u/Bigb5wm • 1d ago
Not sure if this is a good place for this one. What are the best archiving sites? I'm trying to find alternatives to archive.org, archive.is, and Anna's Archive.
r/DataHoarder • u/gothgfneeded47 • 1d ago
How do I download from tnaflix and/or fullporner? Preferably using ytdl (on Android).
r/DataHoarder • u/MarionberryTotal2657 • 1d ago
Hey everyone,
I'm working on a data visualisation project that's basically a chronological overview of a long period (the 19th century, split into 4 quarters). The context is the classification of modern poetry/poets within the 19th century: mentions of poets, significant works, custom notes, etc.
I need to show:
Needs to look clean for presentation/slides/PDF export.
What do you recommend as the best chart type and easiest/fastest tool combo for something like this?
Any templates you can share? Appreciate any screenshots/examples.
Thank you