A Full Guide To Getting Started With Mining Litecoin

Missed out on making millions with Bitcoin (what’s Bitcoin, you ask?)? Don’t worry – you’ve still got time to jump in on alternative cryptocurrencies, like Litecoin and Feathercoin. Thanks to a different algorithm at their core, these can still be mined by anyone with a graphics card – expensive ASIC Bitcoin miners won’t work. What are you waiting for?!

Kannon published a tutorial back in July about getting started with Litecoin, but the software tools have improved since then, so I thought it would be worth publishing an update. One point to note: at the time of Kannon’s article, Litecoins were worth less than $3; they’ve since increased to ten times that value and are significantly more difficult to mine. If you were paying attention, unlike me, then congratulations on your newly acquired fortune – hold onto them!

Disclaimer: There are a lot of alternative cryptocurrencies out there, some of which are attempts to get rich quick by the coin’s creator. I’m suggesting Litecoin because it’s the next largest cryptocurrency that’s still possible to mine, but obviously I can’t guarantee the whole cryptocurrency bubble won’t be a complete failure and be banned in every country in a few years. That’s exactly what stopped me mining Bitcoins two years ago now, a decision I clearly regret. Worst case scenario, you’re investing in a nice gaming machine. If you’re willing to take a gamble on other cryptocurrencies, check CoinWarz for an up-to-date profitability list and limit your search to “Scrypt”-based currencies – those can all still be mined using graphics cards.

Check Viability

First, you’re going to need a good graphics card. It is possible to mine using a CPU, but it’s so ridiculously slow it’s just not worth it, even with a top-of-the-range CPU. ASICs designed for Bitcoin mining are also ineffective due to the different algorithm used to mine Litecoins. That’s the thing about ASICs – they’re really good at one dedicated task and useless at anything else.

AMD Radeon cards (like this XFX AMD Radeon HD 7750) are overwhelmingly better than equivalently priced NVIDIA cards due to their different architecture (more, slower processors, as opposed to fewer, faster ones), as well as a particular math operation that can be done in one cycle on AMD cards but requires 3 cycles on NVIDIA cards. Remember, we’re not talking about gaming here – this certainly doesn’t mean AMD cards beat NVIDIA on gaming performance; it’s purely about the math required to mine cryptocurrencies.

Of course, running a high-end graphics card on full load 24/7 requires a lot of power; so much, in fact, that unless your graphics card produces more than a certain rate of kilohashes per second, you’ll be making a net loss due to power costs (unless your power is free, in which case, have at it with anything you’ve got). Therefore, it’s important you either invest in a new card, or accurately check what your current card is capable of.

The easiest way is to check the Litecoin mining hardware comparison chart and see if your card is listed – take the lowest figure there as a realistic estimate of what you can achieve easily, which will give your calculation a little room for error and allow for idle time due to network issues and the like. Plug that number into this calculation tool, along with an estimate of power consumption and power costs for your area, and out will come an idea of how much “profit” you’ll be making – but bear in mind the difficulty will increase rapidly.
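
To give you a feel for the arithmetic the calculator is doing (the numbers here are purely illustrative, not a prediction for any particular card): a card hashing at 600 kH/s while drawing 300 watts, with electricity at $0.15 per kWh, costs roughly

0.3 kW × 24 h × $0.15/kWh ≈ $1.08 per day

in power alone. Whatever the calculator estimates your daily coins to be worth has to beat that figure – and keep beating it as the difficulty climbs – before you’re making any real profit.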

For a more realistic approach to Return-On-Investment rather than daily profit, check out this Reddit thread.
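
As a back-of-the-envelope example of what that means (again, made-up numbers for illustration only): if a new card costs $400 and nets you $3 a day after electricity at today’s difficulty, the naive payback period is around 133 days – and since the difficulty only goes up, the real figure will be longer, possibly never.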

At this point, it may be painfully obvious that you’d end up making a net loss with your current setup.

For this reason, I invested in a new graphics card, a top-of-the-line AMD Sapphire Radeon R9 290. It hits a sweet spot between power consumption and production rate, even though the purchase cost is quite high. It’s not just the graphics card, of course – the new card required more power than my 350 watt PSU could provide, so that’s another $100 for a new PSU.

Join a Pool

You could mine solo, but it’s harder, more work to set up, and the rewards aren’t as reliable. Joining a pool means you get paid for all the work you do, whether or not you actually find a block; it’s like a lottery syndicate. Check out the current list of mining pools here. Once you’ve joined a pool, you should find somewhere to “manage workers”. From that page, you’ll find a worker name and password (leave it as x – you don’t need a real one), and there should be a list of that pool’s servers somewhere too. Note these all down.

Litecoin BAMT

The mining scripts are available for Windows as Kannon described before, but I experienced so many driver problems and variations in performance that it just wasn’t worth the trouble. BAMT is a 2 GB Linux distro that boots off USB – it’s dedicated to the task of mining, and it does it well. Why mess around? After two days of thinking I had a dud card — mining would work for a few minutes, then any tweaks would kill it and require a complete re-install of Windows (yes, seriously) — I was up and running with a BAMT Live USB in 10 minutes. Seriously, don’t use Windows.

Forgive the bad photo – my Linux screenshot skills are basically non-existent

At the time of writing, the latest Litecoin-enabled BAMT is version 1.2 and it works fine with my R9 290 graphics card, with no driver updates. If you have a newer card, you may need to perform some updates. Download Litecoin BAMT here, and use Win32 Disk Imager to burn it onto a 2 GB USB flash drive. Boot from it, and presto – you’re mining. Unfortunately, you’re mining for someone else, since you haven’t set it up with your own addresses yet. Do that now.
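
If you’d rather write the image from a Linux machine than use Win32 Disk Imager, dd does the job – the image filename below is a placeholder for whatever your download actually extracts to, and /dev/sdX stands in for your USB stick (double-check it with lsblk first; dd will happily overwrite whatever you point it at):

# write the BAMT image to the USB stick – filename and device are placeholders
sudo dd if=litecoin-bamt-v1.2.img of=/dev/sdX bs=4M
sync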

Note: If you’re planning on only using LTC Rabbit, they have their own customised version of BAMT which you might want to look into – the rest of these instructions should still apply.

Using BAMT Headless

Log in via SSH – the IP address should be shown on the BAMT desktop – as follows:

ssh user@IP

The default password is “live”.
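
Since that default password is public knowledge, it’s worth changing it once you’re in – BAMT doesn’t force you to, but it’s a sensible precaution for a machine that will eventually be earning money:

passwd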

Switch to superuser mode:

sudo su

Initial configuration is done by editing /etc/bamt/cgminer.conf:

nano /etc/bamt/cgminer.conf

Read below for details on what to put in this file, but at the very least adjust it to use your mining pool login credentials. When you’re done, save the file and restart the mining process with:

/etc/init.d/mine restart

To view the cgminer output and make interactive changes, run:

screen -ls

Note the number you find next to the cgminer process, then “attach” to that screen with the following command, replacing 8213 with the number your screen output reported:

screen -r 8213.cgminer

From there, you can enter interactive mode, or just keep an eye on your card. Press Ctrl+A and then D to detach from the screen again without stopping the miner.

Here’s a sample output:

[Screenshot: cgminer status output]

The interactive text menu allows you to change settings without permanently adjusting the config file. For example, to push the GPU memory clock to 1300, you would press G -> C -> M -> 1300 -> Enter.

Configuring cgminer.conf

Before you do anything else, open the cgminer.conf file for editing. Right now it has someone else’s mining pool details in there, so you’ll need to change that as soon as possible and restart. Note that the user entry is a concatenation of your username on that mining pool, a period, and the worker name; the password is usually just x – don’t enter your actual mining pool password – this is just for the worker.

{
"url" : "stratum+tcp://stratum.give-me-ltc.com:3333",
"user" : "username.workername",
"pass" : "x"
}

If you’ve only joined one pool, delete the details for the additional one, and remove the comma that’s currently separating the two. At some point, you’ll want to join an additional pool as a failover in case the first one is down or doesn’t have work units available.

This file is in JSON format, so each property and value must be surrounded by double quotes, with a single colon (:) separating each “property” : “value” pair; between pairs, you need a comma.
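
To make that concrete, here’s roughly how the pools section looks with a primary pool and a failover defined – the first URL matches the sample above, while the second is simply a placeholder for whichever backup pool you sign up with:

"pools" : [
{
"url" : "stratum+tcp://stratum.give-me-ltc.com:3333",
"user" : "username.workername",
"pass" : "x"
},
{
"url" : "stratum+tcp://backup-pool.example.com:3333",
"user" : "username.workername",
"pass" : "x"
}
]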

The exact settings you should use will vary by the graphics card – you can search on Google to find something suitable. Here’s some general advice to keep in mind:

  • Intensity is a basic measure of how hard to work the graphics card, from 0-20. The faster you work it, the hotter it will get.
  • You can keep the temperature constant by setting auto-fan to true and defining a temp-target (a target temperature to maintain). When it goes past this, you’ll hear the fans whack into full blow.
  • Keep the GPU engine clock lower than normal, and keep the memory clock as high as possible (whilst still being stable) – you’ll need to experiment with your graphics card to find what works best.
  • Don’t run at 90 Celsius the whole time unless you want to significantly reduce the life of your card.

You should have no trouble finding a suitable configuration for your graphics card – a quick search will return lots of possibilities to try, and here’s a sample one for R9 series cards. Setting up multiple cards is more difficult, but not impossible.
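
As a rough illustration of how the advice above translates into the config file, the properties below go in the same top-level object as the pools section, separated by commas like any other JSON entries. The values are illustrative starting points I’d expect to tweak per card rather than tuned settings for any particular model, and if your BAMT image already defines some of them, adjust the existing values instead of adding duplicates:

"scrypt" : true,
"intensity" : "13",
"gpu-engine" : "900",
"gpu-memclock" : "1300",
"auto-fan" : true,
"temp-target" : "75",
"temp-cutoff" : "90"

Restart the miner after each change, and watch the temperatures and hashrate in screen for a while before pushing the clocks any further.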

Finally: Get a Wallet

At some point you’ll want to be paid, and for that you’ll need a wallet to “receive” the payments. Your wallet is an application which keeps updated with the network, synchronising transactions. Since your mining machine may be regularly upgraded, reformatted, or otherwise messed around with, it’s a good idea to download the wallet to a more reliable machine that you keep backed up. Head over to litecoin.org and download one suitable for your OS. Alternatively, you can store your currencies in an online currency exchange wallet such as BTC-e.

For the first few hours, your wallet will be out of sync, meaning it’s still catching up on all the previous transactions.

You can check the Help -> Debug window menu to see the block chain information. When the “Last block time” is roughly the current date, you’ll be in sync again. While out of sync, none of your transactions will show correctly, so if you’ve transferred some in already you’ll need to wait.

Cryptocurrency has made malware and hacking more lucrative than ever – now there’s actual money sitting on your computer, and not just personal information. In some cases, people have lost thousands of coins. For this reason, you’ll want to encrypt your wallet from Settings -> Encrypt Wallet. Absolutely don’t forget your passphrase, obviously – that would be like losing your real-life wallet with cash in it.

I’m probably not going to make a huge amount of money mining Litecoins; but my ageing GeForce 7800GT was due an upgrade anyway to get ready for the VR revolution. Did you take Kannon’s advice back in July? Are you still skeptical of cryptocurrencies, or are you ready to have a go for yourself?

 

Benchmark performance: iOS 6 vs iOS 7 on the iPhone 5

With iOS 7 now available for iPhone users, we have been getting queries asking if there has been any performance impact, negative or positive, after the upgrade to the latest OS. We attempt to answer those questions with the help of benchmarks.

Apple has made iOS 7 officially available now for the iPhones, the iPad and the latest generation iPod Touch. There are the fans, and there are those criticizing it, but one thing cannot be disputed – this is the biggest iOS upgrade in years. The changes are significant, with the most visible one being the new UI. Hence, the inevitable question – how is the performance of the new OS?

Users have been asking us this for quite some time now, but we refrained from pronouncing any judgement on iOS 7’s performance while it was in the various pre-launch stages. However, now that the final version is out, we have run a series of benchmark tests on the iPhone 5 running iOS 6.1.4, then upgraded it OTA to iOS 7 and run the same benchmarks again, after individually checking each of them for updates.

To ensure that no benchmark run had any advantage or disadvantage, all the tests across both OSes were run on the same device, with the exact same settings. We even ensured that Bluetooth was switched off at all times. The applications installed remained consistent across all benchmark runs, on both iOS versions. We manually closed all apps running in the background, so that background load didn’t have a bearing on performance or resources. For the browser benchmark tests, we hooked up the phone to the same Wi-Fi network, with no other device connected to the hotspot at the time.

  

To break it down even further, we have subdivided the tests into categories – system performance, graphics performance and web browser performance.

System Performance: Better memory handling makes the difference
The Passmark Mobile Performance Test registers a lower system score in iOS 7 but, critically, better overall storage and memory scores, with faster writes to internal storage and faster reads and writes to memory. This is critical, because for most tasks that don’t need raw processor power, the read and write speeds make all the difference between smooth performance and a sluggish setup. The CPU score also goes up with iOS 7, showing that the new OS is ever so slightly better optimized. The overall system score does go down, but that could well be down to the 2D graphics test the benchmark runs (its score drops sharply), which we will not pay much attention to since we have dedicated graphics benchmark tests coming up.

Passmark Performance Test Mobile
iOS 6.1.4 iOS 7 % Difference
System Score 3932 3603 -8
CPU Score 24922 25418 2
Disk Score 13154 13480 2
Read (in Mbyte/s) 44.9 44 -2
Write (in Mbyte/s) 157 164 4
Memory Score 2998 3295 10
Read (in Mbyte/s) 517 566 9
Write (in Mbyte/s) 509 569 12
2D Score 1641 1250 -24
3D Score 1756 1762 0.34

The second system test, GeekBench, also backs up what PassMark says about the slightly better processor performance, with the multi-core test showing a marginally higher score than on iOS 6.

GeekBench
iOS 6.1.4 iOS 7 % Difference
Single Core Score 722 722 0.00
Multi-Core Score 1298 1302 0.31

Graphics Performance: Improves slightly 
While not everyone likes to game on their iPhone, there are those who do. And they were served rather well by the very smooth performance and experience that iOS 6 offered with this particular hardware. After the iOS 7 upgrade, the 3D Mark Ice Storm and Ice Storm Ultimate benchmark tests register slightly better scores, while the Ice Storm Extreme score remains the same. Simply put, the excellent existing performance under iOS 6 improves even further with the new operating system.

3D Mark
iOS 6.1.4 iOS 7 % Difference
Ice Storm 5418 6011 11
Ice Storm Extreme 3351 3351 0.00
Ice Storm Ultimate 5516 5711 4

GFXBench gave a slightly different analysis, with the offscreen test registering pretty much the same frames and fps scores, while the onscreen test saw a drop in both frames and fps.

GFX Bench
iOS 6.1.4 iOS 7 % Difference
Egypt HD (Offscreen) – Frames 3357 3358 0.03
Egypt HD (Offscreen) – fps 30 30 0.00
Egypt HD (Onscreen) – Frames 4572 4185 -8
Egypt HD (Onscreen) – fps 40 37 -8

Browser Test: Safari sees an improvement, mostly visual elements
The SunSpider 1.01 browser benchmark shows slightly better JavaScript performance. For a web browser, even seemingly negligible differences can add up, and in this case the change is in the right direction.

Sunspider 1.01 
Safari on iOS 6.1.4 Safari on iOS 7
Score (lower is better) 729.1ms 707.7ms

From the scores, it is clear that the performance difference between iOS 6 and iOS 7, at least on the iPhone 5 hardware, is not large. Critically, there is a very slight improvement in areas like memory and storage access speeds, which makes for an improvement in overall speed and experience. All in all, you can safely upgrade your iPhone to the latest OS with no fear of a performance drop, at least on the iPhone 5.

Have you updated your iPhone, iPad or the latest iPod Touch to iOS 7? We would like to hear about your experience.

Source: http://www.thinkdigit.com/Mobiles-PDAs/Benchmark-performance-iOS-6-vs-iOS-7_17646.html

 

Samsung denies claims that it doctored Galaxy S4 benchmark results

Samsung’s official statement still ignores certain findings, including the existence of the ‘BenchmarkBooster’ string in the Galaxy S4’s system files.

Samsung has released a statement refuting all accusations that it engineered the Galaxy S4’s hardware to perform better when running certain benchmarks. In the statement posted on its blog, Samsung says, “The maximum GPU frequencies for the Galaxy S4 have been varied to provide optimal user experience for our customers, and were not intended to improve certain benchmark results.” Samsung’s statement comes in the wake of yesterday’s news that the Galaxy S4’s CPU and GPU were clocking at higher than normal speeds when running recognized benchmarks. According to a Galaxy S4 user on the Beyond 3D forums and AnandTech, the Galaxy S4’s GPU is clocked at a default speed of 480MHz when running regular apps but speeds up to 532MHz when running certain popular benchmarks.

Samsung’s blog post states that the GPU inside the Galaxy S4’s Exynos 5 Octa SOC had a theoretical maximum clock speed of 533MHz but that was brought down to 480MHz for certain gaming apps “that may cause an overload, when they are used for a prolonged period of time in full-screen mode.” The post further goes on to say that the Galaxy S4 does allow the GPU to work at its maximum clock speed when “running apps that are usually used in full-screen mode, such as the S Browser, Gallery, Camera, Video Player, and certain benchmarking apps, which also demand substantial performance.” In an update, AnandTech has revealed that almost none of the Galaxy S4’s first party apps mentioned in the statement actually push the GPU to 532MHz and the behaviour seems to be limited to the popular benchmarks. They added that the camera is the only first party app that pushes the GPU to the maximum clock speed but that speed is not sustained. In contrast, running popular benchmarks on the Galaxy S4 seems to elicit the higher clock speeds for a longer duration.

Apart from the above inconsistency, Samsung’s clarifications have still left other questions unanswered. Samsung has, strangely, avoided commenting on certain things that were uncovered by AnandTech. For instance, AnandTech found that the Galaxy S4’s Exynos 5 CPU also runs at maximum clock speeds when running certain benchmarks, irrespective of the load the benchmark actually puts on it. When a regular app (or a non-recognised benchmark) is run, the clock speeds are lower, even if the app (such as GFXBench 2.7.0) is similar to the recognised benchmarks that “demand substantial performance.”

Samsung has also entirely ignored the existence of the ‘BenchmarkBooster’ string that AnandTech found in one of the Galaxy S4’s system files. The ‘BenchmarkBooster’ string makes for interesting reading, since it refers by name to the benchmarks that have been given permission to boost the GPU clock speed to the maximum.


An extract from the ‘BenchmarkBooster’ string.

However, like we stated in yesterday’s story on the same issue, this behaviour by the Galaxy S4 doesn’t affect you, the user, in a major manner. Samsung’s decision to set the actual GPU clock speed lower than the theoretical maximum is standard practice because otherwise you run the risk of your device overheating and even failing. However, the problem lies in the fact that Samsung set certain rules for the Galaxy S4’s operation and then deliberately let certain benchmarks break those rules just so that they could post higher numbers. This is disingenuous as it does not let benchmarks paint an accurate picture of the Galaxy S4’s performance and it also does not let potential buyers compare the S4 with other competing devices accurately with the help of benchmarks.

If you are in the market for a new smartphone, our advice, as always, would be to rely on real-world use and your comfort with the device rather than synthetic benchmark figures. Even if you do tend to refer to benchmarks, don’t let those numbers be a big factor in helping you decide on a smartphone.