CoinLab

CoinLab FAQ: How to Earn Silver Coins With Your Graphics Card


Nvidia sucks at mining because they rely on fewer, more sophisticated shader cores, while AMD is good at it because they make up for simpler core designs by adding many parallel shaders (which happens to make them good at brute-forcing things).

 

https://en.bitcoin.it/wiki/Mining_hardware_comparison

 

Interesting, since you must have OpenCL to run it. OpenCL is an NVidia interface for the CUDA cores. So how does this work for AMD? I have an AMD Radeon HD 6450 sitting here. I tried to install OpenCL for AMD and it didn't work. Told me I needed an NVidia card to install this. Won't install without CUDA cores. Any ideas? And how much faster does it mine with your AMD? And what AMD card do you have?

Edited by JackDawson


Ummm, what is going on with this? I have been stuck on "Connecting to compute pool..." Scam? Broken?

 

 

It's not a scam. But it will take you MONTHS before you can gain any kind of decent amount of iron with it. To make 10 silver you need 100,000 iron. My high-end NVidia video card produced roughly 40 iron an hour for me, or so it was telling me. But I tried it for four hours and only got 66 iron total. So I think it is broken, or you could be right, maybe it is a scam. I'm actually not sure anymore. It would be great if this gets fixed and I can gain enough iron to pay for my account each month. But, after doing the math, I can see this is never going to be possible.

Edited by JackDawson


Ummm, what is going on with this? I have been stuck on "Connecting to compute pool..." Scam? Broken?

 

It has been up for me with no interruption in the statistics graph, but I use cgminer directly on the pool, not the GPU client (since I don't even run my GPU anymore). Maybe Coinlab is doing some experiments with other work units given the decline in bitcoin value recently.

 

Interesting, since you must have OpenCL to run it. OpenCL is an NVidia interface for the CUDA cores. So how does this work for AMD? I have an AMD Radeon HD 6450 sitting here. I tried to install OpenCL for AMD and it didn't work. Told me I needed an NVidia card to install this. Won't install without CUDA cores. Any ideas? And how much faster does it mine with your AMD? And what AMD card do you have?

 

Actually, CUDA is the nVidia-specific interface to their stream processors (what I called shaders in previous posts). OpenCL is the generic interface that AMD/Intel/nVidia all support. nVidia has more capable stream processor designs, allowing them to behave as CUDA cores, while AMD has a simpler stream processor design, meaning they need more of them to compete with the graphics performance of same-generation nVidia cards, and they cannot be used as CUDA cores.

 

My retired 5770 has 800 stream processors @ 850MHz and mines at 200MH/s (roughly 90i per hour at current rate using 100W)

Your 6450 has 160 stream processors @ 750MHz and should mine at 30MH/s (roughly 13i per hour using 25W)

 

A newer generation does not necessarily mean better mining performance: as stream processor designs get better, fewer are put in the GPU.

5850 has 1440 stream processors @ 725MHz and should mine at 300MH/s

6850 has 960 stream processors @ 775MHz and should mine at 200MH/s

7850 has 1024 stream processors @ 860MHz and should mine at 270MH/s
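The figures above suggest a rough rule of thumb: mining speed scales with stream processors times clock. A back-of-the-envelope sketch (the constant is simply fitted to the 5770's 200MH/s figure above; it is illustrative, not an official number):

```python
# Back-of-the-envelope: MH/s roughly proportional to
# stream processors x clock. K is fitted to the 5770's 200 MH/s
# figure quoted above -- an illustration, not an official constant.
K = 200 / (800 * 850)

def est_mhs(stream_processors, clock_mhz):
    """Very rough mining-speed estimate for same-era AMD cards."""
    return stream_processors * clock_mhz * K

cards = [("HD 5770", 800, 850), ("HD 6450", 160, 750),
         ("HD 5850", 1440, 725), ("HD 6850", 960, 775),
         ("HD 7850", 1024, 860)]
for name, sp, mhz in cards:
    print(f"{name}: ~{est_mhs(sp, mhz):.0f} MH/s")
```

The estimates land within about 15% of the figures quoted above, which is as good as such a crude linear model gets across different architectures.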

 

It's not a scam. But it will take you MONTHS before you can gain any kind of decent amount of iron with it. To make 10 silver you need 100,000 iron. My high-end NVidia video card produced roughly 40 iron an hour for me, or so it was telling me. But I tried it for four hours and only got 66 iron total. So I think it is broken, or you could be right, maybe it is a scam. I'm actually not sure anymore. It would be great if this gets fixed and I can gain enough iron to pay for my account each month. But, after doing the math, I can see this is never going to be possible.

 

Looking at the GTX660TI, it does have 1344 stream processors, unlike the GTX560TI which only has 384 (compared to the 1280 of the HD7870 and 1120 of the HD6870). But the historically bad performance of nVidia cards has likely made active development of CUDA mining code non-existent. And I highly doubt it will be fixed unless you are the one who rewrites the code, since most active development these days is in ASICs (not even FPGAs anymore, which is what I am mining with: 800MH/s using 40W). The reason ASICs are the hot topic is that 1000MH/s only uses 10W.

 

No, it is not a scam; I have gotten almost 2 gold from it so far. I mined with the 5770 + FPGA until I got to 1 gold. The second gold came because I couldn't find a better place to point my FPGA after playing around with various pools like p2pool.


What Miner app can I use to log-in with this information:


 


Host: http://pool.coinlab.com:8332


User: user_(all lower case)


PW: x


 


I have tried GUIMiner on my Windows 7 64-bit machine, and I had the information entered properly but nothing happened. I clicked "start mining" and it didn't even do anything. When adding the "worker" I needed to pick a "type" and chose "CG Miner"; is that right? (I have no idea...)


 


So, I want to try another but I'm new to this... I want to learn more about digital currency and more specifically, BitCoins. However, for now, I just want to make Wurm in-game currency through CoinLab.


 


I use the CoinLab Wurm Miner and my stats seem low for my GPU. I have a GeForce GTX 680 and it's not displaying my "Current Mining" but my speed reads: Mining Speed - 45.1 Iron/hr


 


Needless to say, it's going to take a long time before I reach anything substantial and I'd like to know why since my GPU isn't bad. It's actually a beast on gaming and for my rig...


 


Appreciate any help, thanks guys!



So, I want to try another but I'm new to this... I want to learn more about digital currency and more specifically, BitCoins. However, for now, I just want to make Wurm in-game currency through CoinLab.

 

I use the CoinLab Wurm Miner and my stats seem low for my GPU. I have a GeForce GTX 680 and it's not displaying my "Current Mining" but my speed reads: Mining Speed - 45.1 Iron/hr

 

 

I suggest you just use the Coinlab client for now since you have it working, and head over to

 

https://en.bitcoin.it/wiki/Main_Page

and

https://bitcointalk.org/

 

and start reading about bitcoins.  I haven't used GUIMiner, so I don't know how to control it.  CGMiner is the command line tool which I use, so I am not sure whether you need to install both CGMiner and GUIMiner, or if GUIMiner comes with a version inside it.
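For reference, pointing cgminer at the pool details quoted above would look something like the command below (a sketch: substitute whatever username Coinlab assigned you; `-o`, `-u`, and `-p` are cgminer's standard pool URL, user, and password flags):

```shell
# Hypothetical invocation built from the pool details posted earlier;
# replace your_username with your own assigned Coinlab user.
cgminer -o http://pool.coinlab.com:8332 -u your_username -p x
```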

 

https://bitcointalk.org/index.php?topic=178533.0

 

This suggests you may need some extra flags to CGMiner to detect nVidia cards.  But in the same thread, someone got 110MH/s with CGMiner and GTX680, which means the Coinlab client is already giving you the same return rate.  I think Coinlab is already the easiest "getting started" mining client.

 

The time to start looking at another miner is when you want to mine for bitcoin instead of iron. (And read my earlier post on why your great GPU isn't giving you a great return.)

Edited by Odi


I see you're the top of the Hashrate Leaderboard, Odi. Nice!


 


I still don't understand why a GeForce GTX 680 would only get 45 iron/hr. Also, how can I see my accurate Hashrate with the CoinLab Wurm client? It's only showing Iron/hr...



OK, I have an ASUS G73jw with an NVIDIA GTX460M. How does this work out for this bitcoin thing?



I see you're the top of the Hashrate Leaderboard, Odi. Nice!

 

I still don't understand why a GeForce GTX 680 would only get 45 iron/hr. Also, how can I see my accurate Hashrate with the CoinLab Wurm client? It's only showing Iron/hr...

Exactly my point... mine is giving only 40 an hour with my GTX 560Ti, and after 4 hours I still only got 66 coins. None of this adds up correctly.

Edited by JackDawson


It has been up for me with no interruption in the statistics graph, but I use cgminer directly on the pool, not the GPU client (since I don't even run my GPU anymore). Maybe Coinlab is doing some experiments with other work units given the decline in bitcoin value recently.

 

 

Actually, CUDA is the nVidia-specific interface to their stream processors (what I called shaders in previous posts). OpenCL is the generic interface that AMD/Intel/nVidia all support. nVidia has more capable stream processor designs, allowing them to behave as CUDA cores, while AMD has a simpler stream processor design, meaning they need more of them to compete with the graphics performance of same-generation nVidia cards, and they cannot be used as CUDA cores.

 

My retired 5770 has 800 stream processors @ 850MHz and mines at 200MH/s (roughly 90i per hour at current rate using 100W)

Your 6450 has 160 stream processors @ 750MHz and should mine at 30MH/s (roughly 13i per hour using 25W)

 

A newer generation does not necessarily mean better mining performance: as stream processor designs get better, fewer are put in the GPU.

5850 has 1440 stream processors @ 725MHz and should mine at 300MH/s

6850 has 960 stream processors @ 775MHz and should mine at 200MH/s

7850 has 1024 stream processors @ 860MHz and should mine at 270MH/s

 

 

Looking at the GTX660TI, it does have 1344 stream processors, unlike the GTX560TI which only has 384 (compared to the 1280 of the HD7870 and 1120 of the HD6870). But the historically bad performance of nVidia cards has likely made active development of CUDA mining code non-existent. And I highly doubt it will be fixed unless you are the one who rewrites the code, since most active development these days is in ASICs (not even FPGAs anymore, which is what I am mining with: 800MH/s using 40W). The reason ASICs are the hot topic is that 1000MH/s only uses 10W.

 

No, it is not a scam; I have gotten almost 2 gold from it so far. I mined with the 5770 + FPGA until I got to 1 gold. The second gold came because I couldn't find a better place to point my FPGA after playing around with various pools like p2pool.

 

Yes, there are still 3D programs and games that will take advantage of it. Although I agree, it is very few. But it works great when you do get software that takes advantage of it.

 

So what you're saying is that since Bitcoin mining uses OpenCL, it doesn't run natively on CUDA, but works on AMD video cards because they use the stream processors that OpenCL was made for?

 

So does this mean ALL NVidia cards are screwed? Oh, and that AMD HD 6450 I mentioned in an earlier post still doesn't work. I still get the message "Missing OpenCL". I am trying to find a way around this.

 

You said you got two gold coins. How long have you been mining to get two gold coins? Remember, it takes 100,000 iron to get 10 silver. And for those of us who can gain only 66 per 4 hours, it would take 9 months to mine that 10 silver.

 

NOTE for those on INTEL: I am on the HD 4000 on another computer. It said I would get 30 iron PER DAY. This is totally not worth it on Intel graphics, so don't bother trying. You would just end up burning it out.

Edited by JackDawson


OK, I have an ASUS G73jw with an NVIDIA GTX460M. How does this work out for this bitcoin thing?

What Odi is saying in his past two pages of messages is: give up if you're using an NVidia video card. It's not worth it.

Edited by JackDawson


What Odi is saying in his past two pages of messages is: give up if you're using an NVidia video card. It's not worth it.

 

Not exactly. I am just trying to give the facts. I mentioned that it looks like the GTX660TI and above have enough stream processors to make decent mining cards (if someone wanted to write the code for it). Whether it is worth it or not is up to the individual to decide. I personally don't even run my 5770 currently due to the heat it generates in the summer.

 

So what you're saying is that since Bitcoin mining uses OpenCL, it doesn't run natively on CUDA, but works on AMD video cards because they use the stream processors that OpenCL was made for?

 

So does this mean ALL NVidia cards are screwed? Oh, and that AMD HD 6450 I mentioned in an earlier post still doesn't work. I still get the message "Missing OpenCL". I am trying to find a way around this.

 

You said you got two gold coins. How long have you been mining to get two gold coins? Remember, it takes 100,000 iron to get 10 silver. And for those of us who can gain only 66 per 4 hours, it would take 9 months to mine that 10 silver.

 

 

Bitcoin is not an entity. It is an open-source currency that is community-developed. Coinlab is an example of an individual coming up with an idea and implementing it. Some people wrote CUDA miners in the past, but that was in 2011, when nVidia cards just didn't have the raw stream processors to make it worthwhile. I suspect that if someone had the drive, energy, know-how, and incentive to do so now, they could squeeze a reasonable hash rate from the GTX660TI and above.

 

For the 6450, you would need to install the AMD APP SDK plus the accompanying Catalyst driver. That is what provides the OpenCL drivers for the card. But you would only get 13 iron per hour with it, because it is a weak card.

 

Since mid-April, I mined with the 5770 + FPGA at 1GH/s until I got 1 gold by mid-May (the USD-to-BTC exchange rate was high and network difficulty was lower). Then from mid-May till now, I only ran the FPGA at 800MH/s because the higher network difficulty made the iron per share much lower. In those two months (including some downtime when trying out p2pool and also moving my FPGA to a Raspberry Pi), I have made 90 silver.

 

Exactly my point... mine is giving only 40 an hour with my GTX 560Ti, and after 4 hours I still only got 66 coins. None of this adds up correctly.

 

Do you have a 560TI or a 660TI? The 560TI only has 384 stream processors, while the 660TI has 1344. Think of it this way: a stream processor can compute a cryptographic hash in X cycles, so the more stream processors you have, the more hashes you can calculate in parallel, and the faster the clock, the more cycles happen in one second.
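That cycles picture can be written as a formula: hashrate is cores times clock divided by cycles per hash. A small sketch (the 3400 cycles-per-hash figure is just back-solved from the 5770's 200MH/s number quoted earlier, not a measured value):

```python
def hashrate_mhs(cores, clock_mhz, cycles_per_hash):
    """Throughput in MH/s = (cores * clock) / cycles needed per hash."""
    return cores * clock_mhz / cycles_per_hash

# Back-solve an effective cycles-per-hash from the 5770 numbers
# quoted earlier (800 SPs @ 850 MHz doing 200 MH/s):
cycles = 800 * 850 / 200                 # 3400 cycles per hash
print(hashrate_mhs(800, 850, cycles))    # recovers 200.0 MH/s
print(hashrate_mhs(1600, 850, cycles))   # double the cores -> 400.0
```

Doubling either the core count or the clock doubles throughput, which is exactly why the core-starved nVidia cards of that era lagged behind AMD at mining.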

 

OK, I have an ASUS G73jw with an NVIDIA GTX460M. How does this work out for this bitcoin thing?

 

Mobile cards are really bad since they don't have good enough cooling. And that one only has 192 stream processors.

 

I see you're the top of the Hashrate Leaderboard, Odi. Nice!

 

I still don't understand why a GeForce GTX 680 would only get 45 iron/hr. Also, how can I see my accurate Hashrate with the CoinLab Wurm client? It's only showing Iron/hr...

 

Coinlab tries to simplify mining by cutting out all the conversion from MH/s to bitcoin to USD to iron.

 

And I suspect that if someone wrote specialized code for the GTX 660 and above, they could get maybe double or even triple the iron, since the 6xx series seems to have triple the stream processors of the 5xx. But since GPU mining is on the way out these days, I don't know who would take the time to do so.

 

And that top hash rate is not even from a GPU; it is from an FPGA. Sometimes others are on top when they turn on their powerful AMD cards, but mine only consumes 40W, so I leave it on to earn me some coin.

Edited by Odi


Hey guys, just want to know if this is still worth it. I want to get at least 10 silver so I can go premium and try it out.

I have a GTX 780.


Do you have a 560TI or a 660TI? The 560TI only has 384 stream processors, while the 660TI has 1344. Think of it this way: a stream processor can compute a cryptographic hash in X cycles, so the more stream processors you have, the more hashes you can calculate in parallel, and the faster the clock, the more cycles happen in one second.

 

 

 

I have the GTX 660Ti by EVGA, with 3 GB of RAM on the video card. It has 1344 CUDA cores. I still only got 66 iron after 4 hours of mining. At first I didn't think this was a scam, but now I'm not so sure. Obviously I didn't get what I am owed.

 

This is my card on newegg : http://www.newegg.com/Product/Product.aspx?Item=N82E16814130811

 

I have a friend who let me use his computer, and he has the exact same card; we bought ours at the same time. On his computer it only came to around 71 iron after 4 hours of mining.

Edited by JackDawson


Hey guys, just want to know if this is still worth it. I want to get at least 10 silver so I can go premium and try it out.

I have a GTX 780.

Good luck with that. I wanted to try to do the same thing. It's starting to look like this whole thing is a scam. The only one who seems to be saying this works out for them is Odi; no one else has stepped up saying it works for them. I mined for 4 hours and only got 66 iron. At that rate it would take me 9 months to get 10 silver. It's not worth it.

Edited by JackDawson


Nvidia is terrible, dude. I get 700+ MH/s on my AMD 7970. I get over a silver a day with all of my systems going at home when I am not playing. I turn off the mining when I am gaming, which is usually 5-9 hrs a day.


 


 


I am not moving my office machines to this though, even at nearly 8GH/s. Just not worth it; I use the BTC as income for my business. Yes, I pay taxes on it, but it keeps me going when there are slow times.


Edited by mdtrav


I have the GTX 660Ti by EVGA, with 3 GB of RAM on the video card. It has 1344 CUDA cores. I still only got 66 iron after 4 hours of mining. At first I didn't think this was a scam, but now I'm not so sure. Obviously I didn't get what I am owed.

 

 

It is not a scam. I agree you definitely aren't getting what your device is capable of, but that is not because someone is scamming the profits for themselves. It is because the community hasn't developed the program/firmware/whatever to take advantage of everything your hardware can deliver.

 

The last time the community actively developed for nVidia was in 2011, when nVidia cards only had 1/10th the stream processors compared to the equivalent price tier of AMD cards. The reason development has stagnated on GPUs is that even the AMD ones are barely earning back the electricity used to run them.

 

The GTX680's entry at https://en.bitcoin.it/wiki/Mining_hardware_comparison shows that it should also earn roughly 50i/h (roughly 110MH/s), so with current software, 40i/h for yours seems about right. Note that 40i/h is an estimate. Think of it like this: you roll a 100-sided die over and over as quickly as you can, and each time it hits 1, you get 0.5 iron. Coinlab calculates that you should get an average of 40i/h, but some hours may only yield 10i when you are unlucky, and other hours may yield 90i when you are luckier.
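The die-roll analogy is easy to simulate. A sketch assuming 8,000 "rolls" per hour, so the long-run average works out to the 40i/h estimate (both numbers are illustrative choices, not Coinlab's actual share rate):

```python
import random

def mine_hour(rolls=8000, sides=100, iron_per_hit=0.5, rng=random):
    """One hour of 'mining': roll a 100-sided die `rolls` times,
    earning half an iron each time it comes up 1."""
    hits = sum(1 for _ in range(rolls) if rng.randint(1, sides) == 1)
    return hits * iron_per_hit

rng = random.Random(42)
hours = [mine_hour(rng=rng) for _ in range(24)]
print(f"mean {sum(hours) / len(hours):.1f} i/h, "
      f"min {min(hours)}, max {max(hours)}")
```

Individual hours swing well above and below the 40i/h average, which is why a short 4-hour sample can land noticeably off the estimate even when nothing is broken.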

 

I suspect most of the cores on your card are not being utilized right now and need a software rewrite (which someone from the community has to do) to be put to use. Maybe that community member will be you, or some other person with the know-how and a GTX660 or higher.

 

 

The only one who seems to be saying this works out for them is Odi.

 

That's because I found out about Wurm Online from the bitcointalk.org marketplace forum.  Someone was selling their account for bitcoins, and when I saw the post, I looked into it and got hooked!

Edited by Odi


To support what Odi said back on the last page: I was getting around 220 iron an hour with my Radeon HD6850 in May. I know it dropped recently with the bitcoin computation values. (I'll provide an update of the new values when I get home.)


 


We are on the tail end of this, folks; the big bitcoin was to be made months and months ago. Mining is not a set rate; it fluctuates with the saturation of the market with more equipment. The speed of bitcoin processing globally has more than doubled from June 1st to July 13th, raising the network difficulty and lowering the value of computation cycles. The secret is out: since there is real money to be made in bitcoin, companies have sprung up and boxes are being built that are designed just for bitcoin mining. These boxes are driving the value of low-end computation like ours (and it is low end, even at 220 iron an hour or whatever your hash rate) to the ground floor. 50 GH/s and 500 GH/s boxes make GPUs look like a Prius pulling a tree uphill. I wouldn't expect to see a lot of new code for running miners on GPUs.
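The effect described above is just proportional scaling: your slice of a fixed block reward is your hashrate divided by the whole network's. A sketch with made-up hashrate and reward numbers (only the "network doubled" observation comes from the post):

```python
def pool_earnings(your_mhs, network_mhs, reward):
    """Your slice of a fixed reward is your share of total hashrate."""
    return reward * your_mhs / network_mhs

base = pool_earnings(200, 20_000_000, reward=100.0)
# Network hashrate doubles while yours stays fixed: earnings halve,
# even though your card is hashing exactly as fast as before.
doubled_net = pool_earnings(200, 40_000_000, reward=100.0)
print(base, doubled_net)
```

This is why Butoxy's iron/hour fell sharply with no change in "the speed of the machine, just the value of the computations."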


 


All this said, I'm going to keep running it for a while yet and see where it goes; maybe something will change and drive the value back up...


 


 


Update: the client says I'm mining between 62 and 70 iron per hour (fluctuating as I watch it), so yes, it's dropped about 70% from a few weeks ago. Not the speed of the machine, just the value of the computations.

Edited by Butoxy


To support what Odi said back on the last page: I was getting around 220 iron an hour with my Radeon HD6850 in May. I know it dropped recently with the bitcoin computation values. (I'll provide an update of the new values when I get home.)

 

We are on the tail end of this, folks; the big bitcoin was to be made months and months ago. Mining is not a set rate; it fluctuates with the saturation of the market with more equipment. The speed of bitcoin processing globally has more than doubled from June 1st to July 13th, raising the network difficulty and lowering the value of computation cycles. The secret is out: since there is real money to be made in bitcoin, companies have sprung up and boxes are being built that are designed just for bitcoin mining. These boxes are driving the value of low-end computation like ours (and it is low end, even at 220 iron an hour or whatever your hash rate) to the ground floor. 50 GH/s and 500 GH/s boxes make GPUs look like a Prius pulling a tree uphill. I wouldn't expect to see a lot of new code for running miners on GPUs.

 

All this said, I'm going to keep running it for a while yet and see where it goes; maybe something will change and drive the value back up...

 

 

Update: the client says I'm mining between 62 and 70 iron per hour (fluctuating as I watch it), so yes, it's dropped about 70% from a few weeks ago. Not the speed of the machine, just the value of the computations.

 

So if you ran your GPU solid for TWO months straight, you would finally make 10 silver.

 

70 iron x 24 hours x 30 days x 2 months = 100,800 iron, which comes to a little over 10 silver. By that point, the card could be burnt out, and you'll end up spending more money than the 10 silver was worth just to replace the video card. Food for thought.
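That arithmetic as a tiny sketch (the 70 iron/hour rate and the 100,000-iron-per-10-silver figure both come from earlier in the thread):

```python
IRON_PER_10_SILVER = 100_000

def hours_to_10_silver(iron_per_hour):
    """Hours of continuous mining needed to reach 10 silver."""
    return IRON_PER_10_SILVER / iron_per_hour

total = 70 * 24 * 30 * 2     # two months of 24/7 mining at 70 i/h
print(total)                          # 100800 iron, just past 100,000
print(round(hours_to_10_silver(70)))  # ~1429 hours of solid mining
```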

Edited by JackDawson


Hi!

I've been using the mining client quite a bit. It used to run at 180-200 i/hr with my GeForce GTX680, but suddenly it dropped to just 35-40 per hour. I've tried updating my drivers, but that did not help. What could have happened?

 

Edit: never mind, I went through the topic and understood it.

Edited by Chinpow


I'm confused. I have an EVGA 650 Ti Boost and I only get like 12 iron an hour when I'm playing, and I peak at 45 when I have everything off. Is there something I can change? I'm on turbo and my GPU temp never gets over 60C. Do I need to turn down display settings or switch to one monitor while the client is in use?


 



I'm going to stop bitcoining when I hit 3 gold, which should be pretty soon. I've already shut down one PC and haven't run the 7970 big iron in a while. I just have one PC with a pair of 7770s and an A10-5800K APU all bitcoining. The 7770s do about 140 MH/s each and the Trinity APU about 110. CGMiner is my bitcoining program of choice; it works with any APU/GPU and has a vast array of safety controls to help ensure you don't kill your GPUs.


Your Terms of Service and Privacy Policy are unavailable at this time (error 404). Any ETA on when they will be back up?



I would say to just stay away from bitcoin mining. It isn't worth the extra strain it puts on your machine.


 


Back when fewer people did it, it was worth doing; you could make some nice coin. Now all it does is burn electricity.


Edited by Outlaw


Asus 7790 OC (about 120 Euro), FX-6300 CPU


 


70-75 iron/h at the normal OC clock (1050 MHz), up to 55 degrees C


75-80 iron/h at the max clock (1200 MHz), up to 65 degrees C


 


Both without high CPU use.


 


OC tool: MSI Afterburner


Edited by Sklo:D


So I need to figure out one of two things. Either:


 


A: How do I get the "new" client to work for me? (It constantly has me stuck at "Connecting to Compute Pool".) Or


 


B: How do I install the old client when it keeps telling me a newer version is installed? I have scoured my computer trying to uninstall it. (I've used the old version successfully in the past, but cannot get it working anymore.)


 


Or more specifically, what files does it install on your computer? Because I'm getting the feeling that not all of the files have been removed by the uninstaller provided...


 


Update: Found a folder in Appdata/Local/CoinLab; however, even after deleting that, I'm still getting a message stating that a newer client is installed. And when trying the new client after that, I still cannot connect to the compute pool.


Edited by Russianranger

This topic is now closed to further replies.