<p dir="ltr">If you wish, but I was thinking hopes and dreams.</p>
<div class="gmail_quote">On Dec 15, 2013 7:24 PM, "WebDawg" <<a href="mailto:webdawg@gmail.com">webdawg@gmail.com</a>> wrote:<br type="attribution"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
Can we make them heavier?<br>
<br>
Possibly Sulfur Hexafluoride?<br>
<br>
<br>
<br>
On Sun, Dec 15, 2013 at 3:20 PM, Barbara Attilio<br>
<<a href="mailto:barbara.attili@gmail.com">barbara.attili@gmail.com</a>> wrote:<br>
> What, you think I'm made of money? Also, rainbow balloons. Maybe 3. One<br>
> half deflated out of spite.<br>
><br>
> On Dec 15, 2013 7:16 PM, "Erik Arendall" <<a href="mailto:earendall@gmail.com">earendall@gmail.com</a>> wrote:<br>
>><br>
>> 99 luftballoons?<br>
>><br>
>> On Dec 15, 2013 6:55 PM, "Barbara Attilio" <<a href="mailto:barbara.attili@gmail.com">barbara.attili@gmail.com</a>><br>
>> wrote:<br>
>>><br>
>>> I can bring balloons!<br>
>>><br>
>>> On Dec 15, 2013 6:53 PM, "<a href="mailto:enabrintain@yahoo.com">enabrintain@yahoo.com</a>" <<a href="mailto:enabrintain@yahoo.com">enabrintain@yahoo.com</a>><br>
>>> wrote:<br>
>>>><br>
>>>> I want a cookie cake!<br>
>>>><br>
>>>> Sent from my Verizon Wireless 4G LTE DROID<br>
>>>><br>
>>>><br>
>>>> Justin Richards <<a href="mailto:ratmandu@gmail.com">ratmandu@gmail.com</a>> wrote:<br>
>>>><br>
>>>> Woohoo!<br>
>>>><br>
>>>> On Dec 15, 2013 6:48 PM, "Kyle Centers" <<a href="mailto:kylecenters@gmail.com">kylecenters@gmail.com</a>> wrote:<br>
>>>>><br>
>>>>> Jeff Cotten says if this thread gets to 50 messages, he'll throw a<br>
>>>>> party. So. This is my contribution?<br>
>>>>><br>
>>>>> On Dec 15, 2013 5:37 PM, "James Fluhler" <<a href="mailto:j.fluhler@gmail.com">j.fluhler@gmail.com</a>> wrote:<br>
>>>>>><br>
>>>>>> Thanks for the link I will check it out!<br>
>>>>>><br>
>>>>>> James F.<br>
>>>>>><br>
>>>>>> On Dec 15, 2013, at 4:04 PM, Stephan Henning <<a href="mailto:shenning@gmail.com">shenning@gmail.com</a>><br>
>>>>>> wrote:<br>
>>>>>><br>
>>>>>> -WD<br>
>>>>>><br>
>>>>>> I'll check the arrays and see what they are currently formatted as.<br>
>>>>>> It's not a big deal to reformat one of these arrays, so that's something<br>
>>>>>> that can be changed quickly and easily.<br>
>>>>>><br>
>>>>>> Eh, I'm not involved in the development, but I'll bring it up and if<br>
>>>>>> it is something that hasn't been considered I'll put some pressure on them<br>
>>>>>> to look into it.<br>
>>>>>><br>
>>>>>><br>
>>>>>> -James<br>
>>>>>> <a href="http://www.ierustech.com/product/v-lox/" target="_blank">http://www.ierustech.com/product/v-lox/</a><br>
>>>>>><br>
>>>>>> It's internally built, just got rolled out to market.<br>
>>>>>><br>
>>>>>><br>
>>>>>><br>
>>>>>> On Sun, Dec 15, 2013 at 2:04 PM, James Fluhler <<a href="mailto:j.fluhler@gmail.com">j.fluhler@gmail.com</a>><br>
>>>>>> wrote:<br>
>>>>>>><br>
>>>>>>> I have not heard of VLOX before, and a quick Google search turned up<br>
>>>>>>> nothing. Is it commercially available or internally built? I've typically<br>
>>>>>>> used NEC, GEMS, EMDS, and Genesys for eMag simulation work.<br>
>>>>>>><br>
>>>>>>> Just curious, but where do you work? Haha.<br>
>>>>>>><br>
>>>>>>> James F.<br>
>>>>>>><br>
>>>>>>> On Dec 13, 2013, at 11:13 AM, Stephan Henning <<a href="mailto:shenning@gmail.com">shenning@gmail.com</a>><br>
>>>>>>> wrote:<br>
>>>>>>><br>
>>>>>>> Method of Moments, Computational ElectroMagnetics.<br>
>>>>>>><br>
>>>>>>> The program is called Vlox.<br>
>>>>>>><br>
>>>>>>><br>
>>>>>>> On Fri, Dec 13, 2013 at 10:47 AM, David <<a href="mailto:ainut@knology.net">ainut@knology.net</a>> wrote:<br>
>>>>>>>><br>
>>>>>>>> MoM CEM vlox -- could you expand those acronyms, please? Is this a<br>
>>>>>>>> logistics planning tool?<br>
>>>>>>>><br>
>>>>>>>><br>
>>>>>>>><br>
>>>>>>>> Stephan Henning wrote:<br>
>>>>>>>><br>
>>>>>>>> -David<br>
>>>>>>>><br>
>>>>>>>> Hmm, sounds interesting. The problem is already distributed a little:<br>
>>>>>>>> you can think of what is being done as a form of Monte Carlo, so the same<br>
>>>>>>>> run gets repeated many times with light parameter adjustments. Each of<br>
>>>>>>>> these can be distributed out to the compute nodes very easily; currently<br>
>>>>>>>> this is being done with Condor.<br>
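>>>>>>>><br>
>>>>>>>> Roughly, handing a sweep like that to Condor looks something like the<br>
>>>>>>>> sketch below (Python writing a plain submit file; ./solver and its<br>
>>>>>>>> --seed/--scale flags are hypothetical placeholders, not the actual tool):<br>
<pre>
# Sketch: generate a Condor submit file for a Monte Carlo-style parameter
# sweep and hand it to the pool. "./solver" and its --seed/--scale flags are
# hypothetical placeholders for the real run script and its options.
import subprocess

def write_submit_file(path, seeds, scales):
    lines = [
        "universe   = vanilla",
        "executable = ./solver",   # hypothetical wrapper around the real run
        "log        = sweep.log",
    ]
    job = 0
    for seed in seeds:
        for scale in scales:
            lines += [
                "arguments = --seed {} --scale {}".format(seed, scale),
                "output    = run_{:04d}.out".format(job),
                "error     = run_{:04d}.err".format(job),
                "queue 1",
            ]
            job += 1
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

write_submit_file("sweep.sub", seeds=range(10), scales=[0.9, 1.0, 1.1])
subprocess.check_call(["condor_submit", "sweep.sub"])  # one job per combination
</pre>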
>>>>>>>><br>
>>>>>>>><br>
>>>>>>>> -James<br>
>>>>>>>><br>
>>>>>>>> It's a MoM CEM tool called vlox.<br>
>>>>>>>><br>
>>>>>>>><br>
>>>>>>>><br>
>>>>>>>> On Fri, Dec 13, 2013 at 5:43 AM, James Fluhler <<a href="mailto:j.fluhler@gmail.com">j.fluhler@gmail.com</a>><br>
>>>>>>>> wrote:<br>
>>>>>>>>><br>
>>>>>>>>> I'm just curious, what simulation program are you running? I've used<br>
>>>>>>>>> a number in the past that also utilize GPUs for processing.<br>
>>>>>>>>><br>
>>>>>>>>> James F.<br>
>>>>>>>>><br>
>>>>>>>>> On Dec 12, 2013, at 11:28 PM, David <<a href="mailto:ainut@knology.net">ainut@knology.net</a>> wrote:<br>
>>>>>>>>><br>
>>>>>>>>> IIRC, the good thing about this cluster is the automagic load<br>
>>>>>>>>> leveling. Your existing binary may not run at max optimization, but if the<br>
>>>>>>>>> task can be spread among processors, Beowulf does a nice job of it. If each<br>
>>>>>>>>> computer has its own GPU(s), then all the better.<br>
>>>>>>>>><br>
>>>>>>>>> You can test it right there without changing anything on the<br>
>>>>>>>>> system's disks. Just create and run all the cluster members off a CD.<br>
>>>>>>>>><br>
>>>>>>>>> Then to test, pick the fastest one of them (maybe even your<br>
>>>>>>>>> existing Xeon box), run your benchmark, record the execution time, then<br>
>>>>>>>>> boot all the other machines in the cluster and run it again. There are only<br>
>>>>>>>>> about two dozen steps to set it up. One professor even put most of those,<br>
>>>>>>>>> along with automatic cluster setup(!), into a downloadable you can boot off<br>
>>>>>>>>> of. That leaves half a dozen steps to tweak the cluster together, then<br>
>>>>>>>>> you're good to go. I have one of those CDs around here somewhere and I can<br>
>>>>>>>>> get details if you're interested. Something to play with. I did it with<br>
>>>>>>>>> only 4 PCs around the house with some code, and even though the code was<br>
>>>>>>>>> never designed for a cluster (just multiprocessing), I got about a 40%<br>
>>>>>>>>> decrease in execution time. The code was almost completely linear in<br>
>>>>>>>>> execution, so I'm surprised it got any improvement, but it did.<br>
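>>>>>>>>><br>
>>>>>>>>> The comparison itself is nothing fancy; a rough sketch (Python, where<br>
>>>>>>>>> run_benchmark.sh and hosts.txt are just placeholders for however you<br>
>>>>>>>>> actually launch the benchmark and list the cluster nodes):<br>
<pre>
# Rough sketch of the single-box vs. cluster timing comparison described
# above. "./run_benchmark.sh" and "hosts.txt" are hypothetical placeholders.
import subprocess
import time

def timed(cmd):
    start = time.time()
    subprocess.check_call(cmd)
    return time.time() - start

single = timed(["./run_benchmark.sh"])  # fastest box on its own
cluster = timed(["mpirun", "--hostfile", "hosts.txt", "./run_benchmark.sh"])
print("single: {:.1f}s  cluster: {:.1f}s  ({:.0%} of single)".format(
    single, cluster, cluster / single))
</pre>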
>>>>>>>>><br>
>>>>>>>>> David<br>
>>>>>>>>><br>
>>>>>>>>><br>
>>>>>>>>> Stephan Henning wrote:<br>
>>>>>>>>><br>
>>>>>>>>> -WD<br>
>>>>>>>>><br>
>>>>>>>>> I believe it's either ext3 or ext4, I'd have to ssh in and check<br>
>>>>>>>>> when I get back on Monday.<br>
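>>>>>>>>><br>
>>>>>>>>> (A quick way to check once I'm in; /data is just a stand-in for wherever<br>
>>>>>>>>> the array is actually mounted:)<br>
<pre>
# Print the filesystem type (ext3 vs. ext4, etc.) of the drive array.
# "/data" is a hypothetical mount point.
with open("/proc/mounts") as mounts:
    for line in mounts:
        device, mountpoint, fstype = line.split()[:3]
        if mountpoint == "/data":
            print(device, fstype)
</pre>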
>>>>>>>>><br>
>>>>>>>>> -David<br>
>>>>>>>>><br>
>>>>>>>>> I'll check into the Beowulf and see what that would entail. I'll<br>
>>>>>>>>> try and talk with the developer and see what their thoughts are on the<br>
>>>>>>>>> feasibility of running it on a cluster. They may have already gone down this<br>
>>>>>>>>> path and rejected it, but I'll check anyway.<br>
>>>>>>>>><br>
>>>>>>>>><br>
>>>>>>>>> On Thu, Dec 12, 2013 at 6:16 PM, David <<a href="mailto:ainut@knology.net">ainut@knology.net</a>> wrote:<br>
>>>>>>>>>><br>
>>>>>>>>>> Sounds like a perfect candidate for a Beowulf cluster to me.<br>
>>>>>>>>>> There are possibly some gotchas, but you'll have the same problems with<br>
>>>>>>>>>> just a single computer.<br>
>>>>>>>>>><br>
>>>>>>>>>> Velly intewesting.<br>
>>>>>>>>>><br>
>>>>>>>>>> Stephan Henning wrote:<br>
>>>>>>>>>>><br>
>>>>>>>>>>> -WD<br>
>>>>>>>>>>><br>
>>>>>>>>>>> The GPUs are sent data in chunks that they then process and<br>
>>>>>>>>>>> return. The time it takes a GPU to process a chunk can vary, so I assume<br>
>>>>>>>>>>> the bottlenecks we were seeing were when several of the GPU cores would<br>
>>>>>>>>>>> finish at about the same time and each request a new chunk that wasn't<br>
>>>>>>>>>>> already in RAM, so the drive array would take a heavy hit.<br>
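>>>>>>>>>>><br>
>>>>>>>>>>> Staging a few chunks in RAM ahead of the GPUs would probably soften<br>
>>>>>>>>>>> that; a minimal sketch of the idea (Python, with load_chunk() and<br>
>>>>>>>>>>> process_on_gpu() as hypothetical stand-ins for the real I/O and<br>
>>>>>>>>>>> dispatch):<br>
<pre>
# Sketch: keep a bounded buffer of chunks staged in RAM so that several GPU
# cores finishing at once don't all hit the drive array at the same time.
# load_chunk() and process_on_gpu() are hypothetical stand-ins.
import queue
import threading

PREFETCH_DEPTH = 8   # chunks kept resident in RAM ahead of the GPUs

def load_chunk(chunk_id):
    # placeholder: read one chunk from the drive array
    with open("chunk_{:06d}.bin".format(chunk_id), "rb") as f:
        return f.read()

def prefetcher(chunk_ids, staged):
    for cid in chunk_ids:
        staged.put((cid, load_chunk(cid)))   # blocks once the buffer is full
    staged.put(None)                         # no more chunks

def consume(staged, process_on_gpu):
    while True:
        item = staged.get()
        if item is None:
            break
        cid, data = item
        process_on_gpu(cid, data)            # GPU pulls from RAM, not the array

staged = queue.Queue(maxsize=PREFETCH_DEPTH)
threading.Thread(target=prefetcher, args=(range(1000), staged), daemon=True).start()
# each GPU worker would then run consume(staged, its_dispatch_function)
</pre>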
>>>>>>>>>>><br>
>>>>>>>>>>> Beyond that, I can't really give you a numerical value for the<br>
>>>>>>>>>>> amount of data they are dumping onto the PCIe bus.<br>
>>>>>>>>>>><br>
>>>>>>>>>>><br>
>>>>>>>>>>> -David<br>
>>>>>>>>>>><br>
>>>>>>>>>>> Yeah, I'm not sure an FPGA exists that's large enough for this; it<br>
>>>>>>>>>>> would be interesting, though.<br>
>>>>>>>>>>><br>
>>>>>>>>>>> While the process isn't entirely sequential, data previously<br>
>>>>>>>>>>> processed is reused in the processing of other data, so that has kept us<br>
>>>>>>>>>>> away from trying a cluster approach.<br>
>>>>>>>>>>><br>
>>>>>>>>>>> Depending on the problem, anywhere from minutes per iteration to<br>
>>>>>>>>>>> weeks per iteration. The weeks-long problems are sitting at about 3TB, I<br>
>>>>>>>>>>> believe. We've only run benchmark problems on the SSDs up till now, so we<br>
>>>>>>>>>>> haven't had the experience of seeing how they react once they start really<br>
>>>>>>>>>>> getting full.<br>
>>>>>>>>>>><br>
>>>>>>>>>>> Sadly, 2TB of RAM would not be enough. I looked into this HP box<br>
>>>>>>>>>>> (<a href="http://www8.hp.com/us/en/products/proliant-servers/product-detail.html?oid=4231377#!tab=features" target="_blank">http://www8.hp.com/us/en/products/proliant-servers/product-detail.html?oid=4231377#!tab=features</a>)<br>
>>>>>>>>>>> that would take 4TB, but the costs were insane and it can't support enough<br>
>>>>>>>>>>> GPUs to actually do anything with the RAM...<br>
>>>>>>>>>>><br>
>>>>>>>>>>><br>
>>>>>>>>>>><br>
>>>>>>>>>> <<<snip>>><br>
>>>>>>>>>><br>
>>>>>>>>>><br>
<br>
_______________________________________________<br>
General mailing list<br>
<a href="mailto:General@lists.makerslocal.org">General@lists.makerslocal.org</a><br>
<a href="http://lists.makerslocal.org/mailman/listinfo/general" target="_blank">http://lists.makerslocal.org/mailman/listinfo/general</a><br>
</blockquote></div>