CoinLab

CoinLab FAQ: How to Earn Silver Coins with Your Graphics Card


Where did you find a version number?

The version number will be displayed in the Settings dialog in an upcoming release. It's not exposed now.


This isn't working for me. 100% GPU usage, est. rate 0.0/hr.

GTX 560Ti, driver 285.62

[screenshot: 9vtdw.png]

Seems your temperature is too high... when I was running SETI with CUDA, the work units would fail if I ran over 70°C. Also, you are going to kill your GPU like this...


For some reason the client is not starting for me. It gives the following error in the miner.exe.log:

Traceback (most recent call last):
  File "miner.py", line 9, in <module>
  File "Tkinter.pyc", line 38, in <module>
  File "FixTk.pyc", line 60, in <module>
  File "os.pyc", line 420, in __setitem__
UnicodeEncodeError: 'ascii' codec can't encode character u'\xf3' in position 10: ordinal not in range(128)

I'm using Windows 7 64-bit with an Nvidia GeForce GTX 460 1GB.

Driver version: 301.42
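
That UnicodeEncodeError usually points at a non-ASCII character (here ó, u'\xf3') in a path that gets written into the environment at startup. Below is a minimal sketch of that failure mode, assuming the bundled miner runs on Python 2 (which the Tkinter/FixTk traceback suggests); the path shown is purely hypothetical.

```python
# -*- coding: utf-8 -*-
# Hypothetical repro, assuming Python 2: environment values are coerced to
# byte strings with the ASCII codec, so a unicode path containing an accented
# character fails, the same kind of assignment FixTk makes in the traceback.
import os

accented_path = u"C:\\Users\\Ram\xf3n\\CoinLab\\tcl8.5"  # made-up install path

try:
    os.environ["TCL_LIBRARY"] = accented_path
except UnicodeEncodeError as exc:
    print exc  # 'ascii' codec can't encode character u'\xf3' ...
```

If that is what's happening here, installing or running the client from a path without accented characters may work around it until the client handles Unicode paths itself.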


Ran it while playing Wurm and came up with some positive and negative effects.

Negative:

*Coins are about 3.8 times more expensive than buying directly from the shop.

*100% GPU load compared to 15% = Noisy fan

*Contributing to global warming.

Positive:

*Got to use some of my math skills.

*Contributing to global warming

Run it during the winter! :D


Seems your temperature is too high... when I was running SETI with CUDA, the work units would fail if I ran over 70°C. Also, you are going to kill your GPU like this...

Yes, the card gets pretty hot. Poorly ventilated case combined with the overclock, I guess. Been using the card for about a year now, not dead yet. :P


How much processing power would coinlab be pulling from my gpus when my system is idle?

I ask because in researching the efficacy of trying this, I couldn't find a way to justify it for myself... and I so very much wanted to :P

At idle, a 560ti pulls about 115W according to some testing done at Anandtech. At full load, it's pulling 330W. Difference of 215W. I have two 560's so total difference of 430W.

The PSU pulls power from the wall as the computer needs it, so that difference in draw is what matters. I have a good PSU, so efficiency losses shouldn't add too much on top.

Assuming Coinlab would be pushing my processors to full load, accounting for local electricity rates and using Rolf's 120i as base reference, buying silver directly from Wurm is cheaper than using Coinlab for me in the long run. I could try to add in cost of dollar to euro conversion, but it just seems like for higher end cards, it adds up to more than a light bulb. Lower end cards may draw less power but also return less coin... guess I'm kind of stuck.

Lastly, I've been researching what the processing power I would be contributing is actually being used for, which led me to hash blocks for bitcoin but I have to admit, it was rather over my head.

It might be helpful in the future to add a bit more explanation about this on the coinlab site itself for anyone concerned as to what they are enabling by providing their processing power.

Thanks! Best of luck in your venture.


How much processing power would coinlab be pulling from my gpus when my system is idle?


Here's a comparison chart: https://en.bitcoin.it/wiki/Mining_hardware_comparison. A 560Ti would do like 60-80 Mhash/s or something.


How much processing power would coinlab be pulling from my gpus when my system is idle?

I ask because in researching the efficacy of trying this, I couldn't find a way to justify it for myself... and I so very much wanted to :P

At idle, a 560ti pulls about 115W according to some testing done at Anandtech. At full load, it's pulling 330W. Difference of 215W. I have two 560's so total difference of 430W.

The PSU pulls power from the wall as the computer needs it, so that difference in draw is what matters. I have a good PSU, so efficiency losses shouldn't add too much on top.

Assuming Coinlab would be pushing my processors to full load, accounting for local electricity rates and using Rolf's 120i as base reference, buying silver directly from Wurm is cheaper than using Coinlab for me in the long run. I could try to add in cost of dollar to euro conversion, but it just seems like for higher end cards, it adds up to more than a light bulb. Lower end cards may draw less power but also return less coin... guess I'm kind of stuck.

Is this where you are pulling your numbers from?

http://www.anandtech...e-250-market/16

The additional consumption from idle to full load is 196W (Furmark). You have two of them, so total additional power draw is 392W.

Your cards can generate about 70 iron per hour each, for a total of 140 iron / hour.

140 iron = 0.014 EUR = 0.0174 USD/hr

This means it is profitable if you pay $0.044 per kilowatt hour or less. Your card is not optimal for the computation we are performing, so you may wish to purchase coins from Rolf directly if you don't have cheap electricity.

EDIT: Using Vorg's power consumption data, the break even point for the GTX560Ti is $0.078 per kilowatt hour.

Let's take the HD 6970 as another example:

Additional power draw from idle to full load: 199W

It can generate 300+ iron per hour.

300 iron = 0.03 EUR = 0.0373 USD/hr

For an HD 6970, if you pay less than $0.18 per kilowatt hour, you are paying less through our service than by purchasing coins directly.

AMD graphics cards tend to be much more efficient for the computations we perform.
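
For anyone who wants to redo this arithmetic for their own card, here is a small Python sketch of the break-even calculation above. The 10,000 iron per EUR conversion follows from 1 silver = 1 EUR, and the EUR-to-USD rate of about 1.243 is inferred from the "0.014 EUR = 0.0174 USD" figure quoted; both are assumptions read off the numbers in this thread rather than anything official.

```python
# Break-even electricity price sketch using the figures quoted above.
EUR_TO_USD = 1.243  # implied by 0.014 EUR = 0.0174 USD

def break_even_usd_per_kwh(iron_per_hour, extra_watts):
    """Dollar value of coins generated per kWh of additional power drawn."""
    usd_per_hour = (iron_per_hour / 10000.0) * EUR_TO_USD  # 10,000 iron = 1 EUR
    return usd_per_hour / (extra_watts / 1000.0)

print(break_even_usd_per_kwh(140, 392))  # two GTX 560 Ti, Anandtech delta: ~$0.044/kWh
print(break_even_usd_per_kwh(140, 224))  # two GTX 560 Ti, Vorg's 112W each: ~$0.078/kWh
print(break_even_usd_per_kwh(300, 199))  # one HD 6970: ~$0.19/kWh
```

If your electricity rate is below the printed number for your card, mining through the client beats buying coins directly; above it, buying directly is cheaper.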

Lastly, I've been researching what the processing power I would be contributing is actually being used for, which led me to hash blocks for bitcoin but I have to admit, it was rather over my head.

It might be helpful in the future to add a bit more explanation about this on the coinlab site itself for anyone concerned as to what they are enabling by providing their processing power.

Thanks! Best of luck in your venture.

Here's a whitepaper we at CoinLab wrote about Bitcoin in January: http://coinlab.com/p...coin-primer.pdf.

In short, the computation you are doing is trying random numbers millions of times a second in a very unpredictable equation. When the result (which ranges from 0 to 4 billion) is small enough (e.g. less than 100), it is broadcast to the Bitcoin network and creates new bitcoins. The threshold of what is small enough adjusts to the power of the network over time, so bitcoins are generated at a fairly constant rate. Bitcoin uses this difficult computation because it is an "honest indicator" of work - you cannot fake it; the only way to get the answer is to actually perform the computation. This prevents people from being able to make "counterfeit" bitcoins.
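
To make the "small enough result" idea concrete, here is a tiny, purely illustrative Python sketch of hash-based proof of work. It is not the real Bitcoin block format or difficulty; the header bytes and target below are made up for the demo.

```python
import hashlib
import struct

def mine(header, target_hex, max_tries=2000000):
    # Vary a nonce and double-SHA256 each candidate; only a hash that sorts
    # below the target (i.e. has enough leading zeros) counts as a solution.
    for nonce in range(max_tries):
        candidate = header + struct.pack("<I", nonce)
        digest = hashlib.sha256(hashlib.sha256(candidate).digest()).hexdigest()
        if digest < target_hex:
            return nonce, digest
    return None

# Ask for 4 leading zero hex digits (about 16 zero bits) so the demo finishes
# in seconds; the real network's threshold is vastly stricter and adjusts over
# time so blocks keep appearing at a roughly constant rate.
print(mine(b"made-up block header", "0000" + "f" * 60))
```

The only way to find a qualifying nonce is to keep hashing, which is exactly why the result is an "honest indicator" of work.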

Edited by CoinLab

Just saw the answer as I was responding to Rotab :)

thanks! reading...

Thank you so much! Unfortunately my rates do appear a bit too high at the moment but... must, build, windmill :P

Interesting to see a real-life example of what I've been reading about AMD vs. Nvidia regarding mathematical computations.

Thanks again for helping me make better sense of it all.

Always exciting to see new innovations come to fruition. Best.

Edited by Reylaark


How much processing power would coinlab be pulling from my gpus when my system is idle?

I ask because in researching the efficacy of trying this, I couldn't find a way to justify it for myself... and I so very much wanted to :P

At idle, a 560ti pulls about 115W according to some testing done at Anandtech. At full load, it's pulling 330W. Difference of 215W. I have two 560's so total difference of 430W.

The PSU pulls power from the wall as the computer needs it, so that difference in draw is what matters. I have a good PSU, so efficiency losses shouldn't add too much on top.

I have a GTX560Ti and an Eaton 2000VA rack-mount UPS which can display the load in watts. I started/stopped the client several times and the change in load was 112W each time.


I have a GTX560Ti and an Eaton 2000VA rack-mount UPS which can display the load in watts. I started/stopped the client several times and the change in load was 112W each time.

Cool!

Using Vorg's power consumption, the break even point for the GTX560Ti is $0.078 per kilowatt hour.


I cannot figure out the charts.

I am running one ATI Radeon HD 5700 series, 1GB.

It seems to produce around 150i/h (although for a few minutes it said 4300i/h).

I feel the hot air blowing out of my desk and it's making me sweat.

Also ran it selecting the CPU, a Phenom II X4 955. Earned 10i/h.

Edited by Rimshot


I cannot figure out the charts.

I am running one ATI Radeon HD 5700 series, 1GB.

It seems to produce around 150i/h (although for a few minutes it said 4300i/h).

I feel the hot air blowing out of my desk and it's making me sweat.

Also ran it selecting the CPU, a Phenom II X4 955. Earned 10i/h.

Can you be more specific? The hashrate differs a lot between the different cards.

I seem to be getting about 1.1 iron per hour for every Mhash/s.

So if you're making 150i/h, I would guess you're doing about 165 Mhash/s, which looks standard for a high-end 5750 or low-end 5770.

Edited by Rotab


I think it's the low-end 5770. I'm not sure how to tell.

It has an 850MHz core clock speed and 1200MHz memory clock speed.

I am unable to find any system info that tells me exactly what it is.

Also, my power costs 12.638¢ per kilowatt hour. My PC is always on anyway. Is it worth it?

Edited by Rimshot


Also, sending a message to the email address in the verification email fails:

Delivery Status Notification (Failure)

support@coinlab.com

Whoops! Nice catch. Wurm support emails should be sent to wurmsupport@coinlab.com

EDIT: added the alias, now you can use either.

Edited by CoinLab


I think it's the low-end 5770. I'm not sure how to tell.

It has an 850MHz core clock speed and 1200MHz memory clock speed.

I am unable to find any system info that tells me exactly what it is.

Also, my power costs 12.638¢ per kilowatt hour. My PC is always on anyway. Is it worth it?

If you have a 5770, it consumes around 108W while computing.

You generate 150 iron = 0.015 EUR = 0.01875 USD per hour.

You earn about $0.17361 worth of coins per kilowatt hour consumed, so you can generate almost 50% more coins than you could by spending the same amount purchasing them directly.

If you can measure your own power consumption I can give you a more accurate calculation.
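
As a rough cross-check against the 12.638¢/kWh rate quoted above, here is the same arithmetic as a quick sketch; the 108W and $0.01875/hour figures are taken straight from the reply above, so treat the output as an estimate, not a promise.

```python
# Net hourly and daily gain sketch for the HD 5770 figures quoted above.
watts = 108                    # extra draw while computing
usd_earned_per_hour = 0.01875  # 150 iron/hour at the quoted conversion
usd_per_kwh = 0.12638          # the asker's electricity rate

cost_per_hour = (watts / 1000.0) * usd_per_kwh       # ~$0.0136
net_per_hour = usd_earned_per_hour - cost_per_hour   # ~$0.0051
print("net: $%.4f/hour, about $%.2f/day" % (net_per_hour, net_per_hour * 24))
```

On those numbers the card comes out ahead of buying coins outright, but only by roughly a dime per day of continuous mining.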


Not to derail, but doing some quick math: I'm seeing, under good conditions with your example of 150 iron per hour using 108W, that it is 50% more coins than buying directly from the shop.

I used 12¢ per kilowatt hour, which would be $0.012ish using 108W, netting me a gain of $0.00675 per hour. If I ran it for 24 hours I'm looking at 16.2 cents. What am I missing here? With one gold worth 65E, I'm seeing the savings over buying from the shop, but not 50%, and nowhere near its real-world value currently.


Not to derail, but doing some quick math: I'm seeing, under good conditions with your example of 150 iron per hour using 108W, that it is 50% more coins than buying directly from the shop.

I used 12¢ per kilowatt hour, which would be $0.012ish using 108W, netting me a gain of $0.00675 per hour. If I ran it for 24 hours I'm looking at 16.2 cents. What am I missing here? With one gold worth 65E, I'm seeing the savings over buying from the shop, but not 50%, and nowhere near its real-world value currently.

In-game Silver is valued at 1 EUR. Where do you get 1 gold being worth 65 EUR? (I am assuming the E in 65E is Euros?)


65e is the standard price to buy from another player. Some pay more, some pay less, but that's about average. I've heard of as low as 50e; I've paid as much as 75e. (And I paid the 75 cuz it was my first time and it was from a CA, so I knew I wouldn't get scammed. Considered it a tip and a thanks.)

Edited by Schwanke


Not to derail, but doing some quick math: I'm seeing, under good conditions with your example of 150 iron per hour using 108W, that it is 50% more coins than buying directly from the shop.

I used 12¢ per kilowatt hour, which would be $0.012ish using 108W, netting me a gain of $0.00675 per hour. If I ran it for 24 hours I'm looking at 16.2 cents. What am I missing here? With one gold worth 65E, I'm seeing the savings over buying from the shop, but not 50%, and nowhere near its real-world value currently.

That is awesome. Thanks a lot for calculating that for me.


Another thing that should probably be accounted for in your profit calculation is cooling cost. Running a compressor is not cheap, especially in the middle of summer (it's 104F here at the moment).

In the middle of winter, however, when your GPU is pulling double duty as a processor AND a space heater, that's a different story. :P


I should have mentioned the specs for my 560: it's an EVGA with Nvidia stock clock rates and 1024MB of GDDR5 memory. EVGA made 2 versions of this card and the other is clocked slightly faster. Higher-clocked cards of course pull more power, and the amount/type of RAM changes the power usage. So if you are close in cost, check how much power your card uses.

Also, check the power usage after the card has been running at load for at least 30 seconds. Some cards turn the fan speed up with load, others might hold back until the heat comes up.

And those in hot-climate areas that use AC will need to notice whether their AC runs more when the card is kept at full load all the time, because the AC has to get that extra heat out of the house. Swamp coolers would not be affected.

In winter, that heat can go toward your heating bill. :)

This topic is now closed to further replies.