Well, first of all, the big data used by researchers has to be shared somehow to enable cooperation, teamwork, etc.; passing it from hand to hand is the worst, purely ad-hoc solution 🙂
Secondly, research grants often require that data and code be open-source and freely available.
Cloud computing is of course not always possible, and that's the reason for this post. Researchers mostly use statistical and data-mining tools that operate on local resources or, at most, on remote databases (e.g. RapidMiner, R).
Dropbox and its public folder – an especially good solution if you have gained a lot of extra space through limited-time promotions (e.g. after buying a Samsung smartphone). I had around 100 GB at my peak and was called a Dropbox guru twice. After putting data into the "Public" folder, the right-click menu offers a "Copy public link" entry that lets anyone in the world download the data.
Amazon AWS cloud – popular among companies, but a paid solution whose cost depends on the transfer used
Google Cloud Storage – info at Google Developers
MS Azure Cloud Services – info on Windows Azure
Amazon Personal cloud
Google Drive
Dedicated machine with cloud software – e.g. Synology NAS drives
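To show how the Dropbox-style public link fits into a research workflow, here is a minimal Python sketch for fetching a shared dataset into a local file that R, RapidMiner, or pandas can then read. The example URL is made up; the `?dl=0` to `?dl=1` switch is how current Dropbox share links force a direct download (the old "Public" folder served raw links directly, so this step is only needed for modern share links).

```python
# Sketch: download a publicly shared dataset so local tools can read it.
# The URL below is hypothetical; only the ?dl=0 -> ?dl=1 trick is Dropbox-specific.
import urllib.request


def direct_link(share_url: str) -> str:
    """Turn a Dropbox share link into a direct-download URL."""
    return share_url.replace("?dl=0", "?dl=1")


def fetch(share_url: str, dest: str) -> None:
    """Download the shared file to a local path (requires network access)."""
    urllib.request.urlretrieve(direct_link(share_url), dest)
```

Usage would look like `fetch("https://www.dropbox.com/s/abc123/data.csv?dl=0", "data.csv")`, after which the file is just a local CSV for any tool.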
If money is short, it seems reasonable to me to create a couple of cloud accounts and split the data into multiple parts (sized to fit each account's quota) before uploading.