Malware on your Android device, picked up from third-party app stores (F-Droid? Amazon?), that steals email accounts and auth tokens. It looks like it only works on the older Android 4 Jelly Bean software (and some Android 5 Lollipop) and below, so it's mostly concentrated in Asia, where lower-end phones are common.
You can see if your account has been affected here: https://gooligan.checkpoint.com/
Thanks for making this comment. This post is a wonderful example of the rampant marketing that has given the security industry a bad name.
- The title is technically accurate, which is the best kind of accurate for clickbait. This is not a novel vulnerability representative of an application security flaw within Google - the malware campaign specifically targets older devices using previously known vulnerabilities.[1] There is no new exploit research here.
- There's a logo and cute name for something which is, again, not a novel vulnerability.[2]
- Scaremongering tactics are used throughout to hype up the finding.[3][4] Deliberately ominous language like "...for now" is perhaps tolerable when it's coming from a media outlet, but it's certainly unacceptable from a firm conducting original security research.
All told, this is closer to "threat intelligence" than real security research. A much better source for this news is the blog post by Google's Director of Android Security, Adrian Ludwig (first footnote, linked elsewhere in this thread as well). In particular, notice its succinctness and its serious yet detached professionalism.
In any case, there are legitimate arguments to be made in favor of extending software or device support lifetimes for vulnerability patches, but the onus is on device manufacturers to coordinate this. In the meantime, it would be great if fewer firms practiced this sort of manic self-promotion, but unfortunately there's little incentive not to.
That's not to say Google has no responsibility in this. Google's OS has a terrible security-update policy. Being able to buy a new computing device off a store shelf that will receive no security updates is terrible, and that's fairly common with Android devices.
Now, there are valid technical reasons that Google can't be as good as Microsoft at pushing out updates to every device running their OS, but still, it's hard to say that Google has made fixing this problem a priority. Even their own 1st-party devices have a pathetic 2-year upgrade window from launch, which, I'll remind you, still means somebody can buy last year's device on a store shelf and stop getting security updates before the device is even out of warranty.
Google is, in most cases, not the device's manufacturer, and on a mobile device the OS and application software are tightly coupled, so you cannot really push OS updates separately from firmware updates, bypassing the manufacturer's own updates.
Google (like Apple and Microsoft) can absolutely do it for devices it manufactures and maintains on its own, and in fact that's one of the selling points of its new phone.
In addition, at least in the US, devices connecting to a cellular network have to go through a certification process enforced by the carriers, and every firmware update requires additional regression testing (time and money) for the device to be admitted onto the network again.
(That's my guess as to why Lenovo dropped the policy of fast updates for Moto phones when they bought Motorola.)
Google have an approval process that manufacturers must go through before they're allowed to bundle Google services. Update mechanisms could easily be built into that approval process. I suspect that they haven't turned the screws too hard on manufacturers for fear of Samsung or LG making an Amazon-style fork.
Google have already drawn their own roadmap with the Android One project - a number of low-end manufacturers have devices that get updates directly from Google. They could easily create a Nexus-type brand that manufacturers can opt into, guaranteeing timely updates and long-term support. This could be a big draw for mid-tier manufacturers.
Exactly. Apple only has one set of hardware to support. Microsoft support the PC platform and you can usually do a fresh install on any machine and it will boot (driver support is a little different).
Android is garbage in this regard. Google binds everyone's feet with the OHA so they are required to use the Google Play Store and services (and they can also never manufacture Amazon devices), yet they don't standardize the system to ensure that AOSP can install anywhere. Part of this is the difficulty of ARM not really being an architecture, but even Microsoft was able to deal with this by requiring UEFI and some standardization on Windows devices (although they're more like Apple where there's limited hardware to support).
Google makes a ton of money from their licensing. It's to their advantage that people buy new phones all the time. If the hardware wasn't all over the place, we'd see more uptake for things like Plasma.
What are the alternatives right now for software devs that are willing to do their own roll-your-own work? Ubuntu Touch doesn't seem to have been updated for most of their ports in forever. Plasma supports two devices, neither of which have sdcard slots.
> Part of this is the difficulty of ARM not really being an architecture, but even Microsoft was able to deal with this by requiring UEFI and some standardization on Windows devices (although they're more like Apple where there's limited hardware to support).
Well, the reason why PCs are a standardized platform is because the industry was built around cloning the AT. If anything wanted to be successful, it had to do everything the AT did the way the AT did it. Once the AT started getting long in the tooth, the industry got together to agree on further standards like the ISA bus (an extended version of the AT's bus), various ATA storage standards (again, derivative of the AT's storage protocol), the ATX form factor, the PC System Design Guide (PC 97/98/99/2001), etc.
The PC industry has a culture of working together and collaborating for the sake of compatibility. For part of this, Intel was involved with the standardization process (something ARM refuses to do), but even they didn't have the full authority to force anyone to adopt their standards. In fact, there were plenty of manufacturers of x86 machines who decided to skip out on PC compatibility entirely in order to do their own thing. Just go ahead and try to install Windows 3.1 on a WonderSwan, for example. It's x86, but not a PC or any kind of AT clone. The industry simply declined to see it as a PC and moved on.
It's a shame that nobody in the phone industry ever attempted a hardware standardization effort. Google and Qualcomm could've worked together to come up with some real standards, but they dropped the ball.
I have to wonder if Google would've been able to force something through if they made Android run entirely on native code instead of shoving everything into a Java-based VM. If the industry couldn't take the shortcut of "let's just port Dalvik to our hardware and call it a day" and instead had to ensure compatibility for a wide array of native software, they might actually have developed some form of collaborative discipline.
What Google needs to do now is collaborate with Qualcomm and come up with their own standardized hardware platform. Create a phone equivalent to PCI, ATX, PC 98, etc. And then refuse to license Android to any device that isn't built on this platform. They should complete the process of moving AOSP into GApps, replace the Linux kernel with a closed-source BSD derivative, and then announce the closure of AOSP. They should do with Android exactly what Microsoft does with desktop Windows.
>What Google needs to do now is collaborate with Qualcomm and come up with their own standardized hardware platform. Create a phone equivalent to PCI, ATX, PC 98, etc. And then refuse to license Android to any device that isn't built on this platform. They should complete the process of moving AOSP into GApps, replace the Linux kernel with a closed-source BSD derivative, and then announce the closure of AOSP. They should do with Android exactly what Microsoft does with desktop Windows.
Making Android closed source won't do anything. Apart from Amazon's Fire devices, Android manufacturers in the US aren't getting their code through AOSP anyway; they get it through a side license with Google.
The problem is not that Google can't force security, it just doesn't want to.
That is specifically a design flaw in AOSP. Right now manufacturers have to integrate their custom device drivers into every new OS build, leading to long delays and fragmentation. The device drivers should be separate, and the OS should expose a stable API and integration points. That way OS upgrades could be pushed out without breaking everything, just like on desktop OSes.
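To make that concrete, here's a minimal sketch of what a stable driver boundary could look like (illustrative Java; all the names are made up, and this is not how AOSP's actual HAL is defined): the OS codes against a frozen, versioned interface, the vendor ships an implementation once, and the OS side can then be upgraded independently as long as the contract holds.

    // Purely hypothetical sketch - none of these names exist in AOSP.
    // The point: the OS depends only on a frozen, versioned interface,
    // so an OS update doesn't force the vendor to rebuild its driver code.

    // Frozen contract owned by the OS vendor. v1 never changes; new
    // capabilities would go into a separate v2 interface.
    interface CameraHalV1 {
        int HAL_VERSION = 1;
        void open();
        byte[] captureFrame();
        void close();
    }

    // Shipped once by the device manufacturer, possibly as a prebuilt.
    class VendorCameraHal implements CameraHalV1 {
        @Override public void open() { /* talk to vendor firmware */ }
        @Override public byte[] captureFrame() { return new byte[0]; }
        @Override public void close() { /* release the hardware */ }
    }

    // OS-side code knows only the interface; a newer OS build can ship a
    // newer CameraService and still load the same vendor implementation.
    class CameraService {
        private final CameraHalV1 hal;
        CameraService(CameraHalV1 hal) { this.hal = hal; }

        byte[] takePicture() {
            hal.open();
            try { return hal.captureFrame(); }
            finally { hal.close(); }
        }
    }

Whether the boundary sits in the kernel or in user space, the key property is that the contract is frozen and versioned, so the two sides can be updated on independent schedules.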
That's not Google's policy, it's Linux. What you suggest would mean abandoning Linux as a kernel. I'd be all for it - the industry needs some more open source kernel competition.
Linux is a bad choice for a half-open environment. Either it's totally open (like most Linux distros) or everything is done by the manufacturer (like routers). But not this garbage with closed-source drivers that will prevent you from recompiling at some point in the future.
The basics are that every phone out there uses a forked Linux kernel patched to hell to get it working. Since none of the drivers are upstreamed, it's unmaintainable.
The Linux kernel does not have a stable driver interface, so shipping updates to phones is a LOT of work.
This doesn't explain why they can't upgrade user-space applications and libraries. It's very rare for a user-space application upgrade to require a kernel update on any major operating system.
> and in a mobile device OS and application SW are tightly coupled
I call bullshit. There's no reason Google can't update everything AOSP-y in /system - libc, libart, libwebkit etc.
That's not the point. Even if it were so, it's still the responsibility of the manufacturer to integrate it into its own firmware and push the update with the carrier's approval.
You are comparing a laptop to a smartphone, which makes no sense: a smartphone has to connect to a cellular network to be useful, and it's the carrier that establishes the rules for the update process.
I agree that it should work as you say for devices with no cellular connectivity, such as Wi-Fi-only tablets, where no parties other than the OS and device manufacturers are involved.
Right now, Google has no credible open competitor to Android, and not for lack of trying. If Android wants to be the Windows to the iPhone's Mac, it will have to get serious about security, or be swept away by the competitors that will inevitably emerge.
I also want to say that voting machines run unsupported Android builds. If Google is derelict in that duty... well, that's a much bigger deal than some compromised Google accounts.
I never heard that about voting machines. Do you have a source for that? I'm not sure why that's more surprising than hearing that they run Windows XP...
The Android device in use is the EA Tablet. The certification tests are listed in "EA TABLET FOR ANDROID WITH JELLYBEAN 4.2.1 ELECTRONIC Test Report," dating from 2013.
To be fair, it's probably the best of the horrible lot in security, but that ain't saying much.
For example, the iVotronic systems contain a readily accessible compact flash card right on the top, which stores the election returns. Demonstration machines are set up in each county, so I went to see one in person. Unsurprisingly, the demo machine's card wasn't even covered with a tamper-evident seal.
The devices, including the compact flash cards and the PEBs, are reused from year to year because the legally required certification for the device is very narrow. As the demo machine compact flash cards and PEBs are re-used in each election, at any time prior to the election, infecting the demo machine can be used as a vector to attack the entire county voting total.
Since the demo machine is not sealed, its compact flash can be accessed. If the compact flash card is compromised, the system can be quickly owned. From there, the malware can spread rather trivially to the PEB unit used as a secure token by the election workers, and from there to the county's Unity system at Election Central, allowing the entire county's vote to be altered. So instead of the 4,500 machine compromises PA is claiming would be necessary to influence a state election, it would probably only take 6-7 people any time in the past ten years planting their malware in a few key counties.
All one would need to do to untraceably change the vote totals would be to walk into the county election commission, swap the compact flash out for your malware, and leave. If you do this at any point prior to the election, the malware can spread from the demo machine to a live voting machine, and finally, when the compact flash cards are entered into the Unity system for the final tally, the malware can compromise the whole lot. Then the malware would self-delete, leaving no reliable paper audit record.
Interestingly, from a legal perspective, the Secretary of the Commonwealth's certification for these machines is contingent upon the locking mechanism preventing access to the compact flash card. The machine that I saw, the most common model in use in the state, physically could not be secured that way. The plastic cover mechanism to which the lock is affixed simply doesn't cover the flash card slot well enough.
Under the PA election code, if a specific requirement of the Secretary's certification is not met, the law would invalidate the votes cast through all the iVotronics as a matter of law. As the machines were not configured as approved, they aren't approved for casting ballots, which would throw the PA recount into chaos. It's probably the only judicial avenue left to sue for a state-wide recount that might actually have a chance of being considered.
Nobody tell Jill Stein. In all likelihood, the PA legislature would just send the current electors anyway, as is their prerogative.
Now I really want to post this to /r/politics or one of the Jill Stein subs, with a title like "PENN VOTING MACHINES COULD HAVE EASILY BEEN HACKED, THE VOTES ARE INVALID".
It would get upvoted, perhaps to the front page, and then news outlets would likely pick up the story.
The iVotronics hacking part is very public. The legal aspect may not be as well-known.
The iVotronics vulnerabilities were documented in a lawsuit joined by the Commonwealth's own Deputy Commissioner of Elections. See Banfield v. Cortes [0]
The Election Code specifies that the Secretary of the Commonwealth shall certify electronic voting systems, and issue directives and instructions upon which such approval is conditioned, with which counties are required to comply.
§ 3031.5. Examination and approval of electronic voting systems by the Secretary of the Commonwealth
(a) The Secretary of the Commonwealth may issue directives or instructions for implementation of electronic voting procedures and for the operation of electronic voting systems....
The county board shall comply with the requirements for the use of the electronic voting system as set forth in the report by the Secretary of the Commonwealth...
(c) No electronic voting system not so approved shall be used at any election... [1]
The Secretary alone determines the method of certification.
While the Legislature mandated that an electronic voting system must comply with specific federal testing and performance standards and the requirements set forth in the Election Code, it does not prescribe a particular testing procedure to govern the manner in which the Secretary is to perform the examination, but ultimately left this discretion to the expertise of the Secretary, who is tasked with implementing the Election Code. [0]
However, counties must still comply with the implementation "directives and instructions" issued by the Secretary.
Section 1105-A of the Election Code, 25 P.S. § 3031.5 requires that the Secretary of the Commonwealth examine all electronic voting system used in any election in Pennsylvania and that the Secretary make and file a report stating whether, in her opinion, the electronic voting system can safely be used by voters and meets all of the applicable requirements of the Election Code...
The Secretary of the Commonwealth certifies the iVotronic Voting System in accordance with the conditions detailed in the reports... and the following conditions. [2]
The implementation directives and instructions in the certification of the iVotronics system include the specific provision that counties "must install the locking mechanism over the serial port and compact flash memory in a manner to prevent access to the compact flash card."
3. Pennsylvania counties using the iVotronic Voting System must install the locking mechanism over the serial port and compact flash memory in a manner to prevent access to the compact flash card. [ibid.]
As the construction of the locking mechanism itself renders the compact flash accessible regardless of the physical lock used, as determined by multiple audits in academia as well as other states, the iVotronics system in question was not certified in accordance with the requirements of the statute.
The Secretary put a caveat on the certification of the iVotronics with which the counties did not comply. This is physically analogous to requiring that a tenant "must install a lock on this door which prevents access to the inside of this room," but the door cannot latch no matter which lock is used. If instead of repairing the latching mechanism, the tenant merely replaces the lock, he would not be in compliance with the directive.
Rather than work with the manufacturers to create a locking mechanism that complied with the Secretary's directive (changing the latch), the counties merely changed the locks used. These locks do not prevent access to the compact flash card, and thus the Secretary's implementation requirements were not met by the counties that used them. The counties' failure to meet these directives was not due to lack of ability, as the iVotronics maintenance contracts include the modifications necessary to comply with state law, nor to lack of knowledge, as the flaws were disclosed during Banfield, which is cited by the certification report itself.
The counties simply failed to ensure that the locking mechanisms were updated subsequent to the Secretary's report. Each county board is required to submit its vote totals to the Commonwealth in accordance with the Election Code. As the Election Code requires counties to comply with the Secretary's directives, a county's failure to meet the Secretary's certification requirements disqualifies its reported vote totals.
It's pretty telling about the seriousness of the recount effort that nobody has even bothered to sue a county that used these machines. The Commonwealth's Election Code is not a mere recommendation to the county. Its provisions regarding DREs are specifically intended to punish counties that do not comply with the Secretary's requirements for certification, which many did not.
But please don't cross post me. You won't accomplish anything, except maybe landing me in DHS lockup for ten days for no reason.
> Being able to buy a new computing device from a store that will receive no security updates is terrible, and is fairly common in Android devices.
This seems like the kind of problem the free market could solve. Just get one phone vendor to guarantee security updates for a few years, and then some customers will start buying those phones. After a while, other vendors will start promising it or lose sales.
> This seems like the kind of problem the free market could solve.
It's the kind of problem that would be solved by a perfect market in which all actors were rational, had access to complete information, and correctly prioritized their long-term and short-term needs.
Alas, the world we live in is seven billion highly distracted primates who interact by wiggling their smallest appendages on grids of buttons and pushing streams of air over a weird blob of muscle located inside an organ also used for food consumption.
The underlying assumption is that a multitude of users would switch to devices produced by such a manufacturer. This, I think, overestimates how much most users currently care about security.
As it turns out, there are more secure devices in the marketplace than the affected phones, but they cost more. All other things equal, a contractual obligation for security policies would increase the cost (and thus price) of devices, and users would likely stick with cheaper options.
>This, I think, overestimates how much most users currently care about security.
The media has failed to inform the lay public about this issue. Users could be made to care about security with the right messaging. Your average user may not understand OS updates but the issue can be phrased simply in terms of product defects which the manufacturer refuses to fix and that put their personal info at risk.
BlackBerry's Android phones do this. They patch phones the same day Google's Nexus devices are patched, or even earlier for beta-program users. Still, no one is buying them.
No it doesn't, because most everyday users don't give a toss about security. It has to be something that is pushed as a best-practice by those who know better, not something that is demanded by an everyday user who doesn't. The invisible hand won't do shit here.
That economic fiction requires an ideal rational actor and a different time horizon.
1) Nobel Prize research (Kahneman & Tversky) showed that economic actors are not rational.
2) Taking a long-term view tends to require sufficient funds to be able to worry about the long term. People with fewer funds worry intensely about the short term, and for them that is totally rational.
Checkpoint has been notorious for this kind of exaggerated marketing, especially within the past few years. My theory is that their security appliance line has been suffering due to superior competitors (source: personal experience; could be wrong without global sales numbers), so I think they're trying to get their name back in people's minds.
This research is definitely good and beneficial, but yes, it's threat intelligence research, not vulnerability research or any sort of revelation. Definitely not deserving of a logo.
Years and years ago I went out searching for UMD devices vulnerable to CSRF.
They universally were, but the only vendor that responded to my email was CheckPoint, who admitted it and said they were working on a fix. (They released the fix soon after, too.) Everyone else gave the 100% silent treatment.
What % of the devices you tested were security appliances? It's good Checkpoint did that, but you would expect most security companies to care a lot more about patching their devices relative to all of the other various network device manufacturers out there.
I'd argue the opposite. It's this dismissive attitude about a vulnerability affecting 1M+ accounts that is the problem in some sectors of the security industry.
You're drawing arbitrary lines around what Google is responsible for and what the user is responsible for, and ultimately blaming the user for having an "older device". But guess what? This problem doesn't affect iOS products anywhere near as much, even though there are hundreds of millions of older devices in use. That's because Apple took an approach that allowed them to ensure devices stay up to date. Google didn't. And that's as important to security as are UX design and all the other often-dismissed factors that go into achieving successful security outcomes.
>the malware campaign specifically targets older devices using previously known vulnerabilities.[1] There is no new exploit research here.
That is a false statement: you are implying a certainty that has not yet been established. A Google employee says that, as far as they have been able to investigate, several variants use known vulnerabilities.
In any case, there is indeed a giant security flaw when an application executing in a supposed sandbox can get root access. It points to severe design flaws in the application model as well as the underlying OS. The fact that elevation of privilege is almost expected should be unacceptable. Given the terrible state of security updates in Android, I would say it is worth drawing as much publicity as possible to these events.
The fact that most consumers aren't aware that most Android devices are susceptible to this kind of vulnerability argues for more noise about these issues - not calming press releases about how the issues are moot with the latest build.
>Malware on your Android device picked up from third party app stores
They say that, but then Google's G+ post[1] says "These apps are most often downloaded outside of Google Play"
You could read "most often" as "some of these were downloaded from Google Play".
Either way, they are exploiting known vulnerabilities. The big issue to me is that phone manufacturers / carriers, by choice, stop patching phones whenever they please.
- Removing apps from Play: We’ve removed apps associated with the Ghost Push family from Google Play. We also removed apps that benefited from installs delivered by Ghost Push to reduce the incentive for this type of abuse in the future. Downloading apps from Google Play, rather than from unknown sources [https://goo.gl/9rqdiH], is a good practice and will help reduce the threat of installing one of these malicious apps in the future.
Adrian also stated they were removing affected apps from the Play Store. Many of the more recent vulnerabilities even show up in the Play Store first.
Adrian's constant defense hinges on saying "stick with the Play Store, where we protect you", but the Play Store really isn't much better; it's just that saying so scares people away from looking at competitors' markets.
It is not likely that this malware is picked up through F-Droid as that 'store' only contains software which was built from source by the store maintainers. Any non-free code is removed from the build before the package is hosted on the download server.
As such, an Android device can be used (in a useful way) without having Google Play services (or, for that matter, any other Google apps) installed. I've been doing just that for more than 5 years now without ever feeling I'm missing out on something.
AOSP or a tailor-made CyanogenMod (with all the Cyanogen-account-related stuff removed) plus F-Droid gives you a perfectly usable device.
Do you actually know which stores they mean? I'd hate for F-Droid to be vilified. F-Droid isn't just a store, it's an Android Repository Browser[1]. It would be a shame if the F-Droid repository was exploited beyond the concessions[2] that they allow.
I do not; I just wanted to throw out a couple that I know of. Hopefully it's neither of those third-party stores, because I like and use them both. I hope it was clear from the question marks in my post that those were just examples; I certainly don't want to smear either one.
If you're going to name app stores, I would think places like Baidu would be more likely, given their size and popularity with users of lower-tier Android devices.
>Looks like it only works on the older Android 4 Jellybean software (and some Android 5 Lollipop) and below, so mostly concentrated in Asia where there are lower-end phones.
Yep, I'm stuck on 4.4.2 because Verizon doesn't provide OTA updates any more, and the 4.4 update ensures that the phone bricks if you go through the process of installing CyanogenMod.
If it is just auth tokens instead of email passwords, shouldn't Google be able to invalidate all these auth tokens in their backend immediately? Force those users to re-login and get new auth tokens?
The malware is still installed and would just capture the new auth tokens. And forcing the user to login would also give the malware an opportunity to capture the actual password.
Google should log other signals such as device ID, IP, network, and region where the request is coming from, and use that data as an additional layer of security in the backend to help identify the folks/org behind the hack.
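A rough sketch of what that could look like server-side (hypothetical Java with made-up class and method names, nothing to do with Google's real infrastructure): bulk-revoke the stolen tokens, and treat a sign-in from an unseen device/region pair as needing step-up verification rather than silently minting a new token.

    import java.util.HashMap;
    import java.util.HashSet;
    import java.util.Map;
    import java.util.Set;

    // Hypothetical backend sketch, not a real API: revoke compromised
    // tokens in bulk and use device/region history as an extra signal.
    class TokenSecurityService {
        private final Set<String> revokedTokens = new HashSet<>();
        // accountId -> set of "deviceId|region" pairs seen before
        private final Map<String, Set<String>> knownSignatures = new HashMap<>();

        // Invalidate every token known to have been issued to compromised devices.
        void revokeAll(Set<String> compromisedTokens) {
            revokedTokens.addAll(compromisedTokens);
        }

        boolean isTokenValid(String token) {
            return !revokedTokens.contains(token);
        }

        // A sign-in from a device/region pair never seen for this account
        // triggers step-up verification (e.g. password plus second factor)
        // instead of silently handing out a fresh token.
        boolean requiresStepUp(String accountId, String deviceId, String region) {
            String signature = deviceId + "|" + region;
            Set<String> seen =
                knownSignatures.computeIfAbsent(accountId, k -> new HashSet<>());
            boolean unknown = !seen.contains(signature);
            seen.add(signature);
            return unknown;
        }
    }

Of course, as noted above, if the malware is still rooted on the device it can capture whatever token comes back after the step-up, so something like this only raises the bar; it doesn't fix the phone.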