Storage@Home

EvilAlchemist

Interesting seminar... It will be interesting to see how it gets off the ground. One thing I noticed was when they discussed only allowing one box per user in this project - at least that was how I interpreted it. Seems like that would prohibit a lot of potential gains, as opposed to assigning each computer a unique work ID, or maybe tying it to the MAC address of the NIC or something like that.

Thanks for the linkage...
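
Just to illustrate what I mean by tying it to the MAC - a quick Python sketch (totally hypothetical, nothing from the seminar) of how a client could derive a stable per-machine work ID:

    import hashlib
    import uuid

    def machine_work_id():
        # uuid.getnode() returns the primary NIC's MAC as a 48-bit integer
        # (it can fall back to a random number if no MAC is found)
        mac = uuid.getnode()
        # hash it so the raw MAC itself never has to leave the machine
        return hashlib.sha1(mac.to_bytes(6, "big")).hexdigest()[:16]

    print(machine_work_id())  # stable across runs on the same box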

 
One thing I noticed was when they discussed only allowing one box per user in this project - at least that was how I interpreted it.

Yeah, I noticed that as well, but he does have a point.

If you have 5 boxen storing data, and they need bits from all five, it will really tax your internet speed.

Plus, home users' upload speeds are slower, so it's kinda hard to pull mass data from one user.
Still, it is a great idea, and I hope they will get it running.

Would be nice to have extra points added to F@H stats (if they do that - some mention of points was in there).

 
That is sexxxy... this better be allowed across boxes... I could put something like 18 TB online quickly! They need to figure out how to scale this for users that have large bandwidth... Gah, they never get it... those of us with FARMS!!! have networks that can handle this!

 
The biggest hurdle would be the bandwidth quotas ISPs are putting in place. In Canada, there is no ISP with fully unlimited download/upload anymore :(

 
The biggest hurdle would be the bandwidth quotas ISPs are putting in place. In Canada, there is no ISP with fully unlimited download/upload anymore :(


How bad are the limits?

If this is true, the project will be dead before it ever gets off the ground.

 
Around here, we are talking about 30-50 GB per month max. Anything over this costs extra, and if it is grossly exceeded, you can lose the account. It's akin to Comcast doing whatever it can to cut BitTorrent traffic.


 
How bad are the limits?

If this is true, the project will be dead before it ever gets off the ground.


You might be overestimating how many people there are in Canada vs. everywhere else in the world.


 
The biggest hurdle would be the bandwidth quotas ISPs are putting in place. In Canada, there is no ISP with fully unlimited download/upload anymore :(


I believe that limit is only imposed on those who speak French;):p

 
The biggest hurdle would be the bandwidth quotas ISPs are putting in place. In Canada, there is no ISP with fully unlimited download/upload anymore :(


MTS has unlimited upload and download usage. Plus no blocked ports.
 
I could probably do this. I have a 10/1 Mbps fibre-to-the-home line...and I've been thinking about upping that to 3 Mbps on the upload, mainly for seeding arrrrchives. But this would be a better use.
 
This could be quite a good and cheap idea. If you're able to get by with, say, a 1.3 Mbps / 512 kbps connection, then in reality all you have to worry about is space, which is far cheaper than whole computers, not to mention operating costs are much lower and it's much easier to work with. Of course the standard issues would apply - you couldn't really use this while playing a game (especially an online game) or downloading anything - but I think it should catch on quite well, and I look forward to the day this is launched.
 
The biggest hurdle would be the bandwidth quotas ISPs are putting in place. In Canada, there is no ISP with fully unlimited download/upload anymore :(


I have DSL from AT&T and I've downloaded over 500 GB a month without seeing any reduction in speeds or any sort of caps :)

 
Storage@home?

you would have to delete your pr0n to make room for that boys :D
 
Storage@home?

you would have to delete your pr0n to make room for that boys :D

I have a media center and I'm set with 2 TB per computer, so I don't think I can fill it with that much porn... On the other hand, I do have one drive filling up quickly :p LOL

 
I have a media center and I'm set with 2 TB per computer, so I don't think I can fill it with that much porn... On the other hand, I do have one drive filling up quickly :p LOL


I have one with 1 TB, but if it works out well for me, I could buy 3 extra 1 TB disks (Samsung Spinpoint F1), put them in RAID 5, and allocate space for S@H.

I think Stanford is aware of the potential bandwidth issue and I'm sure they will implement mechanisms for everyone to control this. As an example, my connection has a 30 GB quota per month, but I rarely go over 7 GB. I could tell the client I can allocate 5-10 GB of bandwidth per month and they would adjust accordingly.
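
As an illustration of the kind of control I have in mind (my own guess at how the client could work, not anything Stanford has said), it could just track monthly transfer against whatever allowance you give it:

    class BandwidthBudget:
        """Track monthly transfer against a user-set allowance (hypothetical client logic)."""

        def __init__(self, monthly_allowance_gb):
            self.allowance = monthly_allowance_gb * 1024 ** 3  # bytes
            self.used = 0

        def can_transfer(self, nbytes):
            return self.used + nbytes <= self.allowance

        def record(self, nbytes):
            self.used += nbytes

    budget = BandwidthBudget(5)        # I tell the client: 5 GB/month for S@H
    chunk = 200 * 1024 ** 2            # a 200 MB transfer request
    if budget.can_transfer(chunk):
        budget.record(chunk)           # serve it and count it against the quota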

Another possibility is the ability to keep a backup of our own result files locally so Stanford could skip that step on their end. That would be a great way to store a lot without using bandwidth; if they ever need to restore a file, they just poll our box and retrieve it. It makes even more sense when you consider that for each WU there are always 5-8 copies floating around for redundancy and validation. They only need 1 valid copy for post-processing analysis, so roughly 80% of the storage space is just backup, kept in case and probably never needed.

There is a lot of thought that needs to go into making the project viable for everyone with broadband and suited to their needs.

 
There is also a paper about it: http://www.stanford.edu/~beberg/[email protected]

It's a very interesting read (I didn't read it all but I read the part about the storage challenge).


Hey brotherman (Xilikon), for us "attention challenged, patience challenged and overall everything challenged" folks (PC for "eat up with the dumb a*ses sometimes"), would you please give us one of your abridged and easy to understand versions of exactly what this "Storage@Home" is? :rolleyes:

Unfortunately I had trouble reading that, I'm sure, wonderful and informative article on Storage@Home. (Man, it's a long one :eek: and my attention span is not that wide :()

Thanks a bunch :p

 
Think of it as distributed storage, with each person donating up to 10 GB of hard drive space... with over 100,000 users, that is close to a petabyte of raw capacity, and still hundreds of terabytes of usable storage even after redundancy. This would allow Stanford to increase the redundancy of their data and save them money, while allowing you to earn extra points for the storage.
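
Rough back-of-envelope math (the 4x replication factor is just my assumption based on the numbers floated earlier in the thread):

    users = 100_000
    donated_gb = 10        # per volunteer
    replication = 4        # assumed copies of each chunk for redundancy

    raw_tb = users * donated_gb / 1024
    usable_tb = raw_tb / replication
    print(f"raw: {raw_tb:.0f} TB, usable after {replication}x replication: {usable_tb:.0f} TB")
    # raw: 977 TB, usable after 4x replication: 244 TB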

 
Very cool, but it won't work with people's upload speeds :(

That is the cool thing about so many users hosting files.

Let's say they need 100 MB of data for a certain project / paper.

If 50 users have the files, that is only 2 MB each to upload... not much at all.

Just have to wait and see how the program goes.
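
That math is easy to sanity-check - a toy sketch of the even split from the 100 MB / 50 users example above (no redundancy, just equal shares):

    def split_into_shares(data, peers):
        # split a payload into roughly equal shares, one per peer
        share = -(-len(data) // peers)  # ceiling division
        return [data[i:i + share] for i in range(0, len(data), share)]

    payload = bytes(100 * 1024 * 1024)       # the 100 MB from the example
    shares = split_into_shares(payload, 50)
    print(len(shares), len(shares[0]) // 2**20, "MB each")   # 50 2 MB each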
 
Somehow I feel this won't be as graceful as F@H is now. My upload is much slower than my download. If they wanted to store and access any decent amount of data on my computer, my internet would choke (it's really sad, about 80-100 kb/s upload max).

Also, keep in mind that Stanford would need massive redundancy to pull this off, so that example of 100 MB / 50 computers = 2 MB each is probably optimistic. I heard him say something like 4 computers, but I doubt they'd want their data missing if all four disappeared. With only four, that probably wouldn't be all that uncommon...
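
To put a very rough number on it (the 90% uptime figure is just an assumption on my part): if each host is online 90% of the time, the chance that every copy of a chunk is offline at the same moment is (1 - uptime) ^ copies:

    uptime = 0.90  # assumed probability a given volunteer box is reachable
    for copies in (2, 4, 8):
        p_unreachable = (1 - uptime) ** copies
        print(f"{copies} copies -> {p_unreachable:.1e} chance a chunk is unreachable")
    # 2 copies -> 1.0e-02, 4 copies -> 1.0e-04, 8 copies -> 1.0e-08

Of course that only covers temporary unavailability; permanent loss (dead drives, people quitting) is the scarier case, which is probably why they'd have to re-replicate data over time.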
 
Somehow I feel this won't be as graceful as F@H is now. My upload is much slower than my download. If they wanted to store and access any decent amount of data on my computer, my internet would choke (it's really sad, about 80-100 kb/s upload max).

There's quite a few of us in the same boat, with poor upload speed. But, look at BitTorrent. With enough peers, I can get the latest Ubuntu CD in an acceptable time frame, despite the fact that many of those peers are uploading at a trickle. And with a system with redundancy and some fault-tolerance designed in from the get-go, it shouldn't be all that painful to most users. We'll just have to see how much access to these files Stanford will need in practice.
 
I forgot the BT analogy, and this is probably an avenue Stanford is looking at for their distributed storage. Maybe a package is split into small 1-5 MB bits and spread across some hundreds of machines. That way, if they need something, they can download it fast even if each host has a capped upload speed. This also helps with redundancy, since each piece may be split and then duplicated 4-8 times to account for machines that are down when they need the data. I also expect them to have some kind of system to monitor availability, so if a few boxes are down for X days, the data is replicated onto new boxes just in case.

All this might look complicated, but with good planning and the sheer number of computers, it could work.
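
A crude sketch of the availability monitor I'm picturing (pure guesswork on my part, none of this is from the paper):

    import time

    DOWN_THRESHOLD_DAYS = 7   # assumed: re-replicate once a box has been silent this long

    def hosts_needing_replacement(last_seen, now=None):
        # return hosts that have been unreachable long enough that their
        # chunks should be copied onto fresh volunteer machines
        now = now if now is not None else time.time()
        cutoff = DOWN_THRESHOLD_DAYS * 86400
        return [host for host, seen in last_seen.items() if now - seen > cutoff]

    last_seen = {"box-a": time.time(), "box-b": time.time() - 10 * 86400}
    print(hosts_needing_replacement(last_seen))   # ['box-b']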

 
Storage@home?

you would have to delete your pr0n to make room for that boys :D

I don't have even a single byte of pr0n on any of my machines. :p I just go for the real thing and leave it at that. Besides, what does the manw[H]ore need with pr0n?

 
Very cool, but it won't work with people's upload speeds :(

640KB uploads here... I think that is plenty fast. They are talking about using it mainly as an archive, not instant hard drive access. I assume they would set some limits... most likely only broadband would be allowed... dial-up = slow and probably not acceptable.
 
I don't have even a single byte of pr0n on any of my machines. :p I just go for the real thing and leave it at that. Besides, what does the manw[H]ore need with pr0n?


Hell, is it wrong that I like plenty of both? Real and pr0n?

 
I really think they should consider allowing more than 10 GB for those of us with huge farms, especially if we have demonstrated that we are stable and reliable.

 
I think the main idea is that they do not want to rely on any one person too much, though. A reliable farm is one thing, but if you have lots of information they need and your power goes out, your farm is suddenly worth a lot less to them.

Now if your borgs are spread around other places that would be better, but then that would make borging a bit more difficult to swallow for some people.

If they let people have more than one computer on this, it would certainly help with redundancy.

I just contradicted myself. I am my own devil's advocate.
 
I really think they should consider allowing more than 10 GB for those of us with huge farms, especially if we have demonstrated that we are stable and reliable.


Exactly. Although the reasons stated are legit, storage is dirt cheap these days.


 