LockWatch, PhotosLock, Snoverlay, and other jailbreak tweaks to check out this weekend

Cydia isn’t quite bustling right now, but we’ve seen some encouraging activity in the community recently by way of semi-jailbreaks and exploits that could potentially lead to something exciting on iOS 11 in the future. While there’s no guarantee anything special will come of it, we can still cross our fingers and hope for the best.

Whether current versions of iOS will receive a jailbreak or not continues to baffle the best of us, but those sporting iOS 10 devices jailbroken with Yalu can still enjoy all of Cydia’s latest releases. In this roundup, we’ll talk about our favorite jailbreak tweaks launched during the past week. As usual, we’ll begin by showcasing our favorites and outline the rest of them afterward.


Apple Hid A Holiday Season Easter Egg In The Apple Store App, Here’s How To Find It

Apple has slipped a little Easter egg into a new Apple Store app update that has been made available via the App Store. Here’s how to find it.

[ Continue reading this over at RedmondPie.com ]

“They All Stink” Says Apple’s Phil Schiller About Android’s Face ID Competitors

When speaking about the competing facial recognition technologies used by Android smartphone makers, Apple’s Phil Schiller took the opportunity to stick the knife in, twist it, and then do it all over again. According to the man himself, “they all stink.”

[ Continue reading this over at RedmondPie.com ]

How to Show Battery Percentage on iPhone X

If you have an iPhone X, you may have noticed that the battery percentage indicator is not an option you can enable in the device settings. This is probably because the prominent notch at the top of the screen, which houses the front-facing camera and the phone’s ear speaker, does not allow for sufficient room …

VLC for iOS gains full support for 4K H.265 video & iPhone X display

VLC, the popular cross-platform media player, was updated on the App Store this morning with support for the native OLED display resolution on iPhone X and the ability to play 4K video encoded in the High Efficiency Video Coding standard (the H.265 codec)…



As Bitcoin surges, so does the Coinbase app in App Store

The price of Bitcoin has reached ridiculous levels in the past few weeks, and no one knows exactly why. As the popular cryptocurrency continues its surge, the Coinbase app for iPhone has risen to the top of the free apps chart in the App Store…



How to Use the One Handed Keyboard on iPhone

The latest iOS versions support a one-handed keyboard mode for iPhone. The one-handed keyboard shifts the keys toward the left or right side of the screen, so that it’s theoretically easier to reach them with a single thumb. This keyboard feature can be particularly helpful for users who have the larger …

How to Disable Emergency SOS on iPhone X to Stop Accidentally Dialing 911

The iPhone X offers an Emergency SOS feature that will automatically dial 911 when the device’s side buttons are held down for several seconds. The Emergency SOS countdown then blares an alarm and counts down 3, 2, 1 before dialing emergency services on your behalf, thanks to a feature called Auto Call. While …

iPhone or iPad Crashing to a Black Screen as of Dec 2? Here’s How to Fix

Is your iPhone or iPad crashing repeatedly to a black screen as of December 2? The crash usually appears to the end user as a sudden black screen with a spinning wheel cursor, after which you must enter your passcode to use the device again. If the bug is particularly bad, …

iOS 11.2 Download Released, Update Now [IPSW Links]

Apple has released iOS 11.2 for iPhone, iPad, and iPod touch devices. The update includes important bug fixes, including a fix for a date-related bug that causes some iPhones to crash repeatedly, along with support for a new feature called Apple Pay Cash. The software update is recommended for all iOS 11 users …

How to Disable Auto HDR on iPhone X and iPhone 8 Camera

The latest iPhone models from Apple, including the iPhone X, iPhone 8 Plus, and iPhone 8, default to automatically enabling HDR on the camera. HDR can often create better-looking pictures by blending the color range of several exposures into a single image, but it can also sometimes make pictures look strange or even worse, …

Direct IPSW Download Links for iOS 11.0 Public Release [iPhone, iPad And iPod touch]

Apple has released the final version of iOS 11 and it is available for download for all supported models of iPhone, iPad and iPod touch. If you are unsure whether your device is supported or not, then check our iOS compatibility list here. There are several ways of installing iOS 11 on an iOS device, […]


Joining Apple

I’m pleased to announce that I’ve accepted a position with Apple’s Security Engineering and Architecture team, and am very excited to be working with a group of like-minded individuals who are so passionate about protecting the security and privacy of others.

This decision marks the conclusion of what I feel has been a matter of conscience for me over time. Privacy is sacred; our digital lives can reveal so much about us – our interests, our deepest thoughts, and even who we love. I am thrilled to be working with such an exceptional group of people who share a passion to protect that.

Technical Analysis: Meitu is Crapware, but not Malicious

Last week, I live-tweeted some reverse engineering of the Meitu iOS app, after it got a lot of attention on Android for some awful things, like scraping the IMEI of the phone. To summarize my own findings: the iOS version of Meitu is, in my opinion, one of thousands of types of crapware that you’ll find on any mobile platform, but it does not appear to be malicious. In this context, I considered exfiltration or destruction of personal data to be a key indicator of malicious behavior, along with any kind of unauthorized code execution on the device or other nefarious tasks… but Meitu does not appear to go beyond basic advertiser tracking. The application comes with several ad trackers and data mining packages compiled into it, which appear to be primarily responsible for the app’s suspicious behavior. While it’s unusually overloaded with tracking software, it doesn’t seem to be performing any kind of exfiltration of personal data, with some possible exceptions around location tracking. One of the reasons the iOS app is likely less disgusting than the Android app is that it can’t get away with most of that kind of behavior on the iOS platform.

Over the life span of iOS, Apple has tried to harden privacy controls, and much of what Meitu wishes it could do just isn’t possible from within the application sandbox. The IMEI has been protected since very early on, so that it can’t be extracted from within the sandbox. Unique identifiers such as the UDID have been phased out for some years, and some of the older techniques that Meitu’s trackers do try to perform (such as reading the WiFi or Bluetooth hardware address, as sketched below) have also been hardened in recent years, so that they no longer work.
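
For illustration, here is a minimal sketch of the classic sysctl routine trackers have historically used to read a device’s Wi-Fi MAC address (the "en0" interface name and overall recipe are the well-known public approach, not a claim about Meitu’s exact code). On modern iOS, the sandbox answers this kind of query with a fixed bogus address (02:00:00:00:00:00):

    #include <stdio.h>
    #include <string.h>
    #include <sys/types.h>
    #include <sys/socket.h>
    #include <sys/sysctl.h>
    #include <net/if.h>
    #include <net/if_dl.h>
    #include <net/route.h>

    /* Classic MAC-address lookup via the routing sysctl. On iOS 7 and
     * later, sandboxed apps get back 02:00:00:00:00:00, which is why
     * trackers still shipping this code learn nothing. */
    int copy_wifi_mac(unsigned char out[6]) {
        int mib[6] = { CTL_NET, AF_ROUTE, 0, AF_LINK, NET_RT_IFLIST,
                       (int)if_nametoindex("en0") };
        if (mib[5] == 0) return -1;                 /* no such interface */

        size_t len = 0;
        if (sysctl(mib, 6, NULL, &len, NULL, 0) < 0) return -1;
        char buf[len];
        if (sysctl(mib, 6, buf, &len, NULL, 0) < 0) return -1;

        struct if_msghdr *ifm = (struct if_msghdr *)buf;
        struct sockaddr_dl *sdl = (struct sockaddr_dl *)(ifm + 1);
        memcpy(out, LLADDR(sdl), 6);
        return 0;
    }

    int main(void) {
        unsigned char mac[6];
        if (copy_wifi_mac(mac) == 0)
            printf("%02x:%02x:%02x:%02x:%02x:%02x\n",
                   mac[0], mac[1], mac[2], mac[3], mac[4], mac[5]);
        return 0;
    }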

Tracking Features

Some of the code I’ve examined within Meitu’s trackers includes the following. This does not mean these features are turned on; many of them appear to be managed by a configuration that can be loaded remotely. In other words, the features may or may not be active at any given time, and it’s up to the user to trust Meitu.

  1. Checking for a jailbreak. This code exists in several trackers, and so there are a handful of lousy, ineffective ways that Meitu checks to see if your device is jailbroken, such as checking for the presence of Cydia, /etc/apt, and so on (a minimal sketch of this style of check appears after this list). What I didn’t find, however, was any attempt to exploit the device if it was found to be jailbroken. There didn’t appear to be any attempts to spawn new processes, invoke bash, or exfiltrate any additional data that it would likely have access to on a jailbroken device. Apple’s App Review team would likely have noticed this behavior if it existed. Apple 1, Meitu 0.
  2. Attempts to extract the MAC address of the NIC (e.g. WiFi). A few different trackers included routines to extract the MAC address of the device. One likely newer tracker realized that this was futile and just returned a bogus address. Another performed the sysctl calls to attempt to obtain it; however, the sandbox similarly returns a bogus address. Apple 2, Meitu 0.
  3. Meitu uses a tool called JSPatch, which is a very sketchy way of downloading and executing encrypted JavaScript from a server. This is basically an attempt to skirt iOS’s restrictions on unsigned code execution by downloading, decrypting, then eval’ing… but it isn’t quite evil enough that Apple thinks it’s necessary to forbid it. Nonetheless, it does extend the functionality of the application beyond what is likely visible in an App Store review, and by using Meitu you may be allowing some of its behavior to be changed without an additional review. No points awarded to either side here.
  4. The various trackers collect a number of different bits of information about your hardware and OS version and upload that to tracker servers. This uses your network bandwidth and battery, so using Meitu on a regular basis could consume more resources. There wasn’t any evidence that this collection is done when the application is in the background, however. If the application begins to use a lot of battery, it should gravitate towards the top of the battery usage application list. Apple 3, Meitu 0.
  5. Code did exist to track the user’s location directly; however, it did not appear to be active when I used the app, as I was never prompted to allow access. If it does become active, iOS will prompt the user for permission. Apple 4, Meitu 0.
  6. Code also existed to indirectly track the user’s location by extracting location information from the EXIF data in images in the user’s photo album. Any time you take a photo, the GPS position where the photo was taken is written into the picture’s metadata, and other applications can read it (if they’re granted access to the photo album or camera). This can be aggregated to determine your home address and potentially your identity, especially if it’s correlated with existing data at a data mining company. It’s a very clever way to snoop on the user’s location without them ever being prompted. It was not clear if this feature was active; however, the hooks did exist to send this data through at least some of the trackers compiled into Meitu, which appeared to include the MLAnalytics and Google AdMob trackers. Apple 4, Meitu 1.
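
As a concrete illustration of the first item, here is a minimal sketch of the naive style of jailbreak check described above (the path list is representative; I’m not claiming these are Meitu’s exact probes). Checks like this are trivially defeated, which is why I call them lousy:

    #include <stdbool.h>
    #include <stddef.h>
    #include <sys/stat.h>

    /* Naive jailbreak detection of the kind commonly compiled into ad
     * trackers: probe for files that only exist on jailbroken devices.
     * Easy to spoof; it verifies nothing beyond file presence. */
    static bool looks_jailbroken(void) {
        static const char *artifacts[] = {
            "/Applications/Cydia.app",                        /* Cydia */
            "/etc/apt",                                       /* APT config */
            "/bin/bash",                                      /* shell */
            "/Library/MobileSubstrate/MobileSubstrate.dylib", /* hooks */
        };
        struct stat st;
        for (size_t i = 0; i < sizeof(artifacts) / sizeof(artifacts[0]); i++) {
            if (stat(artifacts[i], &st) == 0)
                return true;    /* found a jailbreak artifact */
        }
        return false;
    }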

Other Observations

  1. Code existed to use dlopen to link directly to a number of frameworks, a technique developers can use to invoke undocumented methods that are normally not allowed by the App Store SDK (a sketch of this kind of runtime linking appears after this list). Chronic reported this in his own assessment, but indicated that it was never called. I have since discussed some of my findings with him – namely, suspicious long jumps in the code involving pointer arithmetic that suggest the calls may have been obfuscated. It is very likely, however, that these calls no longer work in recent versions of iOS due to ASLR. The entire issue is moot anyway, as I’ve been informed that weak linking in this fashion is now permitted in the App Store, so long as the developer isn’t using it as a means to call unsupported methods. I did not see evidence of that happening.
  2. Meitu does obtain your cellular provider’s name, using an authorized framework on the device, as well as observes when that carrier changes (possibly to determine if you’re switching between WiFi and LTE). This appears to be permitted by Apple and does not leak information beyond what’s stated here.
  3. Code was found that makes private framework calls, but as Chronic pointed out, they no longer work. This was likely also old code lying around from earlier versions of the app or trackers.
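
And a sketch of the runtime linking technique from the first item above, using dlopen/dlsym (the framework path and symbol name are illustrative examples of the pattern, not what Meitu actually loads):

    #include <dlfcn.h>
    #include <stdio.h>

    /* Runtime linking: load a framework with dlopen and resolve a symbol
     * with dlsym instead of linking at build time. The dependency never
     * appears in the binary's import table, which is why this pattern
     * has historically drawn scrutiny during App Store review. */
    int main(void) {
        /* Illustrative framework path only. */
        void *handle = dlopen(
            "/System/Library/Frameworks/CoreTelephony.framework/CoreTelephony",
            RTLD_LAZY);
        if (!handle) {
            fprintf(stderr, "dlopen failed: %s\n", dlerror());
            return 1;
        }

        /* Resolve a function by name; a real caller would cast the
         * pointer to the correct signature before invoking it. */
        void *sym = dlsym(handle, "CTTelephonyCenterGetDefault");
        printf("symbol %s\n", sym ? "resolved" : "not found");

        dlclose(handle);
        return 0;
    }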

A number of these trackers were likely written at different times in the iOS life cycle, and so while some trackers may attempt to perform certain privacy-invading functions, many of these would fail against recent versions of iOS. A number of the broken, no-longer-used functions likely did work at one point, until Apple hardened the OS against them.

Summary

Meitu, in my opinion, is the quintessential data mining app. Apps like this often provide trivial functionality, as fart and flashlight apps do, in order to get a broad audience to use them and add another data point to a series of marketing databases somewhere. While Meitu denies making any money off of these trackers, there’s very little other reason in my mind to justify building so many into one application – but that is a judgment call for the user to make.

Because of all of the tracking packages baked in, Meitu is a huge app. I cannot vouch for its safety. There may very well be something malicious that I haven’t found, or perhaps something malicious delivered later through their JSPatch system. It’s a big app, and I’m not about to give them a free complete static binary analysis.

At the end of the day, using Meitu isn’t likely to adversely affect your system or steal your data; however, it’s important to understand that there is a fair bit of information that could be used to track you, like cattle, in some marketing / data mining system used by advertisers. Your adversary here isn’t China; it’s more likely the department store down the street (or perhaps a department store in China), but feel free to insert your favorite government conspiracy theory here – it could possibly be true, but they have better ways to track you. If you don’t mind being tracked in exchange for giving yourself bug eyes and deleting your facial features, then Meitu might be the right app for you.

Three Recommendations to Harden iOS Against Jailbreaks and Malware

Apple has been fighting for a secure iPhone since 2007, when the first jailbreaks came out about two weeks after the phone was released. Since then, they’ve gotten quite good at keeping the jailbreak community on the defensive side of this cat and mouse game, and hardened their OS to an impressive degree. Nonetheless, as we see every release, there are still vulnerabilities and tomhackery to be had. Among the most notable recent exploits, iOS 9 was patched for a WebKit memory corruption vulnerability that was used to deploy the Trident / Pegasus surveillance kit on selected nation state targets, and Google Project Zero recently announced plans to release a jailbreak for iOS 10.1 after submitting an impressive number of critical vulnerabilities to Apple (props to Ian Beer, who should be promoted to wizard).

I’ve been thinking about ways to harden iOS against jailbreaks, and came up with three recommendations that would up the game considerably for attackers. Two of them involve leveraging the Secure Enclave, and one is an OS hardening technique.

Perfect security doesn’t really exist, of course; it’s not about making a device hack-proof, but rather increasing the cost and time it takes to penetrate a target. These ideas are designed to do just that: they’d greatly frustrate and upset current, ongoing jailbreak and malware efforts.

Frustrating Delivery and Persistence using MAC Policy Framework

The MAC Policy Framework (macf) is a kernel-level access control framework originally written for TrustedBSD that made its way into the xnu kernel used by iOS and macOS. It’s used for sandboxing, SIP, and other security functions. The MAC (mandatory access control) framework provides granular controls over many aspects of the file system, processes, memory, sockets, and other areas of the system. It’s also a component of the kernel I’ve spent a lot of time researching lately for Little Flocker.

Rooting iOS often requires getting kernel execution privileges, at which point, in most cases, all bets are off – you can, of course, patch out macf hooks. Gaining kernel execution, however, can be especially tricky if you’re depending on an exploit chain that performs tasks that can be thwarted with macf before your kernel code ever gets off the ground. It can also force an attacker to increase the size and complexity of their payload in order to successfully disable it, all of which takes time and increases cost. Should an attacker still succeed, disabling macf will leave sandboxes and a number of other iOS features broken – features that jailbreaks want to leave intact. In short, it would require a much more complex and intricate attack to whack macf without screwing up the rest of the operating system.

For example, consider a kernel-level exploit that requires an exploit chain involving writing to the root file system, injecting code into other processes (task_for_pid), loading a kext, or performing other tasks that can be stopped with a macf policy. If you can prevent that task_for_pid from ever happening, the exploit chain might never get off the ground to make the rest possible. Should the attack succeed in spite of this added security, you’ve now forced the attacker to go digging pretty deep in the kernel, find the right memory addresses to patch out macf, and invest a lot of time to be sure their jailbreak doesn’t completely break application sandboxing or other features. In other words, it takes a lot of work to break macf without also breaking third party apps and other features of the iPhone. Adding some code to sandboxd to test macf would also be extra gravy; if macf is compromised and that causes sandboxd to completely break, the user is going to notice it and perhaps find their phone unusable (which is what you’d want if a device is compromised).

Apple understands that if you can keep an exploit chain from getting off the ground, you can frustrate attempts to gain kernel execution. For example, Apple mounts the root partition as read-only; it’s trivial to undo this, as any jailbreak demonstrates – all you need is root, not even kernel. But what about macf? The MAC Policy Framework can prevent any writes to the root file system at the kernel level, and can even prevent it from being mounted read-write except by a software update. MAC is so well written that even once you’re in the kernel, opening a file (vnode_open) still invokes macf hooks; you’d have to go in and patch all of those hooks out first in order to disable it. This means that lower down on your exploit chain, your root user won’t be able to gain persistence without first performing surgery in the kernel (and likely breaking the rest of the OS). A sketch of what such a policy could look like follows.
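
To make this concrete, here is a rough sketch written against the TrustedBSD-style API in xnu’s security/mac_policy.h. Treat it as illustrative: the hook and structure names follow the public xnu headers, but is_rootfs() and is_software_update() are hypothetical helpers, and a real policy would hook many more operations.

    #include <sys/errno.h>
    #include <sys/fcntl.h>
    #include <sys/vnode.h>
    #include <security/mac_policy.h>

    /* Hypothetical helpers: identify vnodes on the root filesystem and
     * credentials belonging to the software update process. */
    extern int is_rootfs(struct vnode *vp);
    extern int is_software_update(kauth_cred_t cred);

    /* Deny opening files for write on the root filesystem unless the
     * caller is a software update. A nonzero return denies the op. */
    static int rootfs_vnode_check_open(kauth_cred_t cred, struct vnode *vp,
                                       struct label *label, int acc_mode)
    {
        if ((acc_mode & FWRITE) && is_rootfs(vp) && !is_software_update(cred))
            return EPERM;
        return 0;
    }

    static struct mac_policy_ops rootfs_ops = {
        .mpo_vnode_check_open = rootfs_vnode_check_open,
        /* Policies can also hook task_for_pid, kext loads, mounts,
         * signals, sockets, and much more. */
    };

    static struct mac_policy_conf rootfs_conf = {
        .mpc_name           = "rootfs_guard",
        .mpc_fullname       = "Root filesystem write protection",
        .mpc_ops            = &rootfs_ops,
        .mpc_loadtime_flags = MPC_LOADTIME_FLAG_NOTLATE,
    };

    static mac_policy_handle_t rootfs_handle;

    /* Registered at boot, before any third party code runs. */
    int rootfs_guard_register(void)
    {
        return mac_policy_register(&rootfs_conf, &rootfs_handle, NULL);
    }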

But wait – macf can do a heck of a lot more than just file control. Using macf, you can prevent unauthorized kexts from loading; you can prevent processes (like cycript and Mobile Substrate) from attaching to other processes (task_for_pid has hooks into macf); you can even control signals, IPC, sockets, and a lot more that could be used to exploit the OS. There’s a whole lot you can do to frustrate an exploit chain before it even gets off the ground by adding some carefully crafted macf policies to iOS that operate on the entire system, not just inside the sandbox.

Apple has yet to take full advantage of what macf can do to defend against an exploit chain, but it could greatly frustrate an attack. Care would have to be taken, of course, to ensure that mechanisms like OTA software updates could still perform writes to the file system and other such tasks; this is trivial to do with macf.

Leveraging the SEP for Executable Page Validation

The Secure Enclave (SEP) has a DMA path for performing very high speed cryptography operations for the main processor. It’s also responsible for unwrapping class keys and performing a number of other critical operations that are needed in order to read user data. The SEP’s cryptographic capabilities could also be used to ensure that the state of executable memory has not been tampered with after boot. I’ll explain.

As I said earlier, the system partition is read-only and remains read-only for the life of the operating system (that is, until it’s upgraded). Background tasks and other types of third party software don’t load until after the device has booted, and usually after the user has authenticated. That means that somewhere in the boot process is a predictable machine state that is unique to the version of iOS running on the device, at least as far as executable pages are concerned.

Whenever a new version of iOS is loaded onto the device, the update process could set a flag in the SEP so that on next reboot, the SEP will take a measurement of all the executable memory pages at a specific time when the state of the machine can be reliably reproduced; this is likely after the OS has booted but before the user interface is presented. These measurements could include a series of hashes of each page marked executable in memory, or possibly other types of measurements that could be optimized. These measurements get stored in the SEP until the software is updated again or until the device is wiped.

Every time iOS boots after this, the same measurements are taken of all executable pages in memory. If a rootkit has been made persistent, the pages will not match, and the SEP can refuse to unlock class keys, which would leave the user at a “Connect to iTunes” screen or similar. A toy model of the idea follows.
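
Here is that toy model in user-space C with CommonCrypto (pure illustration; the real thing would live in the SEP’s firmware and secure storage): hash each executable page at the reproducible boot point, then compare those hashes on every subsequent boot.

    #include <string.h>
    #include <CommonCrypto/CommonDigest.h>

    /* Toy model: SHA-256 each executable page and compare against a
     * baseline captured right after a software update. In the proposal,
     * the baseline hashes would be stored inside the SEP. */
    typedef unsigned char page_hash_t[CC_SHA256_DIGEST_LENGTH];

    static void measure_page(const void *page, size_t len, page_hash_t out) {
        CC_SHA256(page, (CC_LONG)len, out);
    }

    /* Returns 1 if every executable page matches its baseline hash,
     * 0 if any differ (i.e., something persistent has patched code).
     * On a mismatch, the SEP would refuse to unwrap class keys. */
    static int matches_baseline(const void *pages[], size_t npages,
                                size_t page_len, const page_hash_t baseline[]) {
        page_hash_t h;
        for (size_t i = 0; i < npages; i++) {
            measure_page(pages[i], page_len, h);
            if (memcmp(h, baseline[i], sizeof h) != 0)
                return 0;
        }
        return 1;
    }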

This technique may not work on some tethered jailbreaks that actively exploit the OS post-boot, but nobody really cares much about those anyway; the user is aware of them, rootkits and malware can’t leverage them without the user’s knowledge, and the user is effectively crippling their phone to use a tethered jailbreak. It does, however, protect against code that gets executed or altered while the system is booting, including kernel patches made in memory. An attacker would have to execute their exploit after the measurements are taken in order for the code to go unnoticed by the SEP.

Care must be taken to ensure that the technique used to flag a software update cannot be reproduced by a jailbreak; this can be done with proper certificate management of the event.

Encrypt the Root Partition and Leverage the SEP’s File Keys

One final concept that takes control out of the hands of the kernel is to rely on the SEP to prevent files from being created on the root partition, by encrypting the root partition with a quasi-class key in a way that lets the SEP refuse to wrap new file keys for files on the root partition. Presently, the file system’s keys are all stored in effaceable storage; if rootfs’s keys were instead treated as a kind of class key inside the SEP, the SEP could refuse to process any new file keys for that specific class short of a system update. Even if Trident were able to exploit the kernel, it theoretically shouldn’t be able to gain persistence in this scenario without also exploiting the Secure Enclave, as it couldn’t create any new files on the file system; it may also be possible to prevent file writes in the same way, with some changes.

Conclusion

The SEP is one of Apple’s most powerful weapons. Two of these three recommendations use it as a means to enforce a predictable memory state and root file system. The third could lead to a more hardened operating system in which an exploit chain becomes frustrated, and/or requires a much more elaborate kernel attack in order to succeed.

San Bernardino: Behind the Scenes

I wasn’t originally going to dig into some of the ugly details about San Bernardino, but with FBI Director Comey’s latest actions to publicly embarrass Hillary Clinton (whom I don’t support), or possibly to tip the election toward Donald Trump (whom I also don’t support), I am learning more about James Comey, and from what I’ve learned, a pattern of pushing a private agenda seems to be emerging. This is relevant because the San Bernardino iPhone matter saw numerous accusations that Comey was pushing a private agenda as well: that it was a power grab for the bureau and an attempt to get a court precedent to force private business to back-door encryption, while lying to the public and possibly misleading the courts under the guise of terrorism.

Just to give you a little background: I started talking to the FBI on a regular basis around 2008, when I released my first suite of iPhone forensics tools for law enforcement. The FBI issued what they called a “major deviation” allowing their personnel to use my forensics tools on evidence. The tools were fast-tracked through NIST/NIJ (the National Institute of Justice is NIST’s law-enforcement-facing arm), and findings were validated and published in 2010. During this time, I assisted some of the FBI’s RCFLs (regional computer forensics labs), including the lab director for one of them, who informed me my tools had been used to recover crucial data in terrorism and child exploitation cases. I’ve since developed what I thought was a healthy working relationship with the FBI: I’ve had a number of their examiners in my training classes, testified alongside some of them (as an expert) on criminal cases, and so on. The reason I’m giving this background is that one would have thought that when someone with this relationship with the FBI called up a few of the agents working on the San Bernardino case, they’d be interested in having my help to get into the phone.

Initially, they were. I spoke to one individual (whom I knew personally), and he helped set up a conference call with a couple of the agents who were working on the case. This was maybe a week in advance, and very early on in the case. The meeting was scheduled, and the agenda was to discuss some details about the device and a couple of potential techniques that I believed might get them into it. One of the techniques was the NAND mirroring approach, which I later demonstrated in a video and which was later definitively proven a viable method by another researcher from the University of Cambridge. He took the more elegant way of doing it, but a quick and dirty dump-and-reball would have gotten the desired result too. Other techniques we were going to discuss were possible screen lock bypass bugs that existed in the device’s operating system, and possibly collaborating with a few other researchers who had submitted code execution bugs affecting that particular version of the firmware. I already had a tested and validated forensic imaging process developed, so it was just a matter of finding the best way to bolt that onto our point of entry.

The day before the conference call was scheduled, it was killed from powers on high. I was never given a detailed reason, and I don’t think my contacts knew either, except that they were told they weren’t allowed to talk to anyone about the device – apparently including me, a forensics expert who had helped them get into phones before. I don’t know if the call came down from lawyers, or if it went higher than that – it’s irrelevant, really. It was understood that nobody at the FBI could talk to me about the case, or even have a one-way conversation so I could give them a brain dump.

The reason I bring this up is that Comey’s public-facing story was that “anyone with an idea” could come to the FBI and help them out. This clearly wasn’t true, and what was going on behind the scenes was quite the opposite… and I’m not some crazy anon approaching the FBI with a crackpot solution, either; I had a working relationship with them, and had assisted them many times before, usually pro bono (as I did with many other agencies). The people knew me, we had each other in our phone books, and every professional level of trust you would expect in cases such as this existed.

Comey’s public story about accepting help on the SB iPhone was entirely false, and he pushed hard over the next month for a court precedent. When it became evident that Comey wasn’t going to win the case in court, a solution suddenly manifested out of nowhere. We paid a million dollars of our tax money for an unlock that the FBI could have done for about $100 with the right equipment.

There were, at the time, a number of other questionable statements made by Director Comey that have led me to believe he wasn’t completely forthcoming in his testimony before Congress.

In a letter emailed from FBI Press Relations in the Los Angeles Field Office, the FBI admitted to performing a reckless and forensically unsound password change that they acknowledge interfered with Apple’s attempts to re-connect Farook’s iCloud backup service. In attempting to defend their actions, the following statement was made in order to downplay the loss of potential forensic data:

 “Through previous testing, we know that direct data extraction from an iOS device often provides more data than an iCloud backup contains. Even if the password had not been changed and Apple could have turned on the auto-backup and loaded it to the cloud, there might be information on the phone that would not be accessible without Apple’s assistance as required by the All Writs Act Order, since the iCloud backup does not contain everything on an iPhone.”

This statement left only two possible conclusions:

Either they were wrong about that, and were reckless…

It is true that an iCloud backup does not contain everything on an iPhone. There is a stateful cache containing third party application data that is not intended to come off of the phone. This is where most private content, such as Wickr, Telegram, and Signal databases, would live. However, this information does not come off the phone in a direct backup either. Similarly, all commercial forensics tools use the same backup facility as iTunes on iOS 9, meaning none of them can get to the stateful cache either.

The backup conduit provides virtually the same data as an iCloud backup. In fact, an iCloud backup arguably provides more data than a direct logical extraction, because iCloud backups are incremental and contain older backups. Desktop backups can sometimes even contain less content, as they exclude photos that have already been synced to iCloud. There are a few small exceptions to this, such as keychain data, which will only come off the phone in a direct backup if backup encryption is turned on. Ironically, if Farook’s phone had backup encryption turned on (which is likely), the FBI wouldn’t be able to get anything at all from a direct copy, because the contents would be encrypted. Even if they found the device to have backup encryption off (and turned it on), they’re still not going to get the data they actually need off of the device (e.g. the cached third party application data); getting passwords doesn’t mean much when you can just subpoena every content provider for the data anyway.

…or the government wanted to compel more assistance, and mislead the courts about it.

As I said, there is in fact more data available on the device than comes off in any backup. The only way to get to this data, however, would be for Apple to digitally decrypt and extract the contents of the file system, and provide them with a raw disk image. This is similar to what Apple had done in the past, except they would now also have to write a new decryption and extraction tool specifically for the new encryption scheme that was introduced in iOS 8, and carried into 9.

This second possibility is far more sinister than simply being wrong about the quality of iCloud data. If the government actually did intend to get a hold of this “extra” data that only Apple can provide, then that means they would be following their original AWA order with a second AWA order, requiring Apple to build a tool to decrypt and extract this content from the device. Their original order required Apple to build a backdoor brute force tool. It did not require Apple to perform any kind of extraction of the raw disk for them. If a second order was in the works, this would have meant two important things:

  1. The attorneys for the FBI provided an incomplete and misleading explanation of assistance to the courts, which intentionally hid the extra assistance that Apple would later be required to provide in order to finish this task – assistance which, when combined with the original list of work, may have been considered unreasonable by the court.
  2. Requiring Apple to break into and image the phone for them anyway would have completely obviated the need for the backdoor tool from the first order, but would have gotten them their encryption precedent for future use.

In other words, if the FBI had been planning to have Apple perform a physical extraction of this extra data, as seems hinted at in the FBI’s comments, then they were forcing Apple to create this backdoor tool for an undisclosed reason. It would also mean that all of this extra work was being hidden from both the courts and from Apple, possibly because the combination of the two AWA orders would have constituted “unreasonable” assistance in the court’s view. It completely modifies the purpose of the first order as well; we’ve gone from a single tool with a very specific purpose to two separate tools creating a modular platform for the government to use (via the courts) as each piece becomes needed. The middle overlap between these two components would have been entirely redundant, useful only to a law enforcement agency looking for a modular forensics toolkit at its disposal, and such work would never have been necessary if Apple had simply broken the PIN and delivered a disk image as a lab service.

The motive, then, for forcing the creation of this backdoor tool would of course have been to create a tool that they could compel for use in the future; it had very little to do with the device they were trying to get into. This was, based on my best guess, the real agenda the FBI was planning to push: not only backdoor-level access to encryption, but a court precedent to force a manufacturer to deliver all of the data on any device they desire to acquire in the future.

Whatever the real reasons were for the FBI’s actions during San Bernardino, one thing was certain: FBI Director Comey’s publicly stated agenda did not match the events that were unfolding behind the scenes. The FBI clearly wasn’t interested in getting into this phone at first. They canceled meetings with at least one expert about it, there are no reports of them ever reaching out to security researchers who had submitted Apple security bugs, and there is no record of them ever checking surveillance footage for Farook entering his PIN anywhere; there’s a significant lack of evidence to support the notion that the FBI ever wanted into the phone. At the very least, it was about setting precedent. At the very worst, further abuses of the All Writs Act were in the works.

It seems as though the same type of private agenda is at work now in our presidential election. The effects have already become evident: many are arguing that NC may have been swayed by Comey’s letter and the FBI’s recent public disclosure of what is portrayed in the released documents as a corruption investigation. The FBI violated its own procedures by releasing all of this on the bleeding edge of an election. There is no question in my mind that the FBI’s publicly stated agenda doesn’t match its private one here either. As I said, a pattern is emerging in which FBI Director Comey appears to mislead the public about his real agenda, and at this point, I think there’s enough smoke that Congress should be looking into his entire history with the bureau to see where else this pattern might have existed.

iPhone 7 And 7 Plus Still Limited In The UK

The iPhone 7 and 7 Plus have had stock issues ever since launching in the UK. Apple Stores in the USA now have plenty of stock, but for some reason over here we’re still lacking.

They remain in extremely short supply, with next to no stock available in any stores and even iPhone Upgrade Program users struggling to get their hands on one. This wasn’t Apple’s smoothest launch.

I carried out a little experiment of my own to see how soon I could get my hands on Apple’s new ‘premium’ standard device, the Jet Black model. Even though I would never buy one, as it’s a complete fingerprint magnet, it’s interesting to see just how short the supply is. The shortage is rumoured to be down to an extremely strict quality control process, with over 50% of units not reaching the desired standard and being re-milled.

iPhone 7 Jet Black Configuration

With only the 128GB and 256GB options available to me, there was essentially no stock, with estimates saying I wouldn’t receive one for around a month and most people’s orders being pushed back further. For people who have bought their devices from a mobile carrier, a lot of carriers are operating reimbursement voucher schemes, either giving money off the device purchase itself or offering around £50-£100 in vouchers for items such as accessories from their own stores, by way of apology.

iPhone 7 Availability

The bad news continues for store stock: I was unable to pick one up within a 30-mile radius, and stores beyond that still had none.

It was rumoured that Apple didn’t actually have as much stock as they normally do for iPhone launches, partly due to late production but also for advertising purposes, as headlines reading “Apple can’t supply enough iPhones” are better than “Apple has an excess of iPhones”.

What do you think? Is this launch a bit of a shambles, or is there some clever advertising going on? Did you have much luck getting your iPhone near launch day? Leave a comment below letting us know.


WhatsApp Forensic Artifacts: Chats Aren’t Being Deleted

Sorry, folks: while experts say the encryption checks out in WhatsApp, the latest version of the app I tested leaves forensic traces of all of your chats, even after you’ve deleted, cleared, or archived them… even if you “Clear All Chats”. In fact, the only way to get rid of them appears to be to delete the app entirely.


To test, I installed the app and started a few different threads. I then archived some threads, cleared some, and deleted some. I made a second backup after running the “Clear All Chats” function in WhatsApp. None of these deletion or archival options made any difference in how deleted records were preserved. In all cases, the deleted SQLite records remained intact in the database.

Just to be clear, WhatsApp is deleting the record (they don’t appear to be trying to intentionally preserve data); however, the record itself is not being purged or erased from the database, leaving a forensic artifact that can be recovered and reconstructed back into its original form.

A Common Problem

Forensic traces are common in any application that uses SQLite, because SQLite does not vacuum databases by default on iOS (likely in an effort to reduce flash wear). When a record is deleted, it is simply added to a “free list”, and free records do not get overwritten until later, when the database needs the extra storage (usually after many more records are created). If you delete large chunks of messages at once, large chunks of records end up on this free list, and it takes even longer for the data to be overwritten by new data. There is no guarantee the data will be overwritten by the next set of messages; in other apps, I’ve often seen artifacts remain in the database for months. You can see this for yourself with the sketch below.
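
Here is a quick desktop demonstration against the SQLite C API (default settings; the table name and marker string are made up for the test):

    #include <stdio.h>
    #include <sqlite3.h>

    /* Insert a row, delete it, and close the database. With SQLite's
     * default secure_delete=OFF, the deleted row's bytes remain in the
     * file on the free list until new data happens to overwrite them. */
    int main(void) {
        sqlite3 *db;
        if (sqlite3_open("demo.db", &db) != SQLITE_OK) return 1;

        sqlite3_exec(db, "CREATE TABLE chat (msg TEXT);", NULL, NULL, NULL);
        sqlite3_exec(db, "INSERT INTO chat VALUES ('SECRET-MARKER');",
                     NULL, NULL, NULL);
        sqlite3_exec(db, "DELETE FROM chat;", NULL, NULL, NULL);
        sqlite3_close(db);

        /* Now inspect the raw file:
         *   $ grep -a SECRET-MARKER demo.db
         * The marker is typically still present in the file. */
        puts("row deleted; grep demo.db for SECRET-MARKER");
        return 0;
    }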

The core issue here is that ephemeral communication is not ephemeral on disk. This is a problem that Apple has struggled with as well, which I’ve explained, along with design recommendations, in a recent blog post.

Apple’s iMessage has this problem and it’s just as bad, if not worse. Your SMS.db is stored in an iCloud backup, but copies of it also exist on your iPad, your desktop, and anywhere else you receive iMessages. Deleted content also suffers the same fate.

The way to measure “better” in this case is by the level of forensic trace an application leaves. Signal leaves virtually nothing, so there’s nothing to worry about – no messy cleanup. Wickr takes advantage of Apple’s CoreData and encrypts its database using keys stored in the keychain (much more secure). Other apps would do well to respect the size of the forensic footprint they’re leaving.

Copied to Backups

Simply preserving deleted data on a secure device is not usually a significant issue, but when that data comes off the device as freely as WhatsApp’s database does, it poses a rather serious risk to privacy. Unfortunately, that’s what’s happening here, and it’s why this is something users should be aware of.

The WhatsApp chat database gets copied over from the iPhone during a backup, which means it will show up in your iCloud backup and in a desktop backup. Fortunately, desktop backups can be encrypted by enabling the “Encrypt Backups” option in iTunes. Unfortunately, iCloud backups do not honor this encryption, leaving your WhatsApp database subject to law enforcement warrants.

Turning off iCloud and using encrypted backups for your desktop doesn’t necessarily mean you’re out of the woods. If you used a weak password that can be cracked by popular forensics tools, such as Elcomsoft’s suite of tools, the backup could be decrypted. Other tools can be used to attack your desktop keychain, where many users store their backup password.

What does this mean?

  • Law enforcement can potentially serve Apple with a warrant to obtain your WhatsApp chat logs from an iCloud backup, which may include deleted messages. None of your iCloud backup content will be encrypted with your backup password (that’s on Apple, not WhatsApp).
    • NOTE: This is “iCloud backup” I’m referring to, and is independent of and irrelevant to whether or not you use WhatsApp’s built-in iCloud sync.
  • Anyone with physical access to your phone could create a backup with it, if access is compelled (e.g. a compelled fingerprint or passcode, or if the phone is seized unlocked). This content will be encrypted with your backup password (if you’ve set one).
  • Anyone with physical access to your computer could copy this data from an existing, unencrypted backup, potentially decrypt it using password-breaking tools, or recover the password from your keychain. If passwords are compelled in your country, you may also be forced to assist law enforcement.

Should everybody panic?

LOL, no. But you should be aware of WhatsApp’s footprint.

How can you mitigate this as an end-user?

  • Use iTunes to set a long, complex backup password for your phone. Do NOT store this password in the keychain, otherwise it could potentially be recovered using Mac forensics tools. This will cause the phone to encrypt all desktop backups coming out of it, even if it’s talking to a forensics tool.
    • NOTE: If passwords are compelled in your country, you may still be forced to provide your backup password to law enforcement.
  • Consider pair-locking your device using Configurator. I’ve written up a howto for this; it will prevent anybody who steals your passcode, or compels a fingerprint, from being able to pair with your phone or use forensics tools against it. This is irreversible without restoring the phone, so you’ll need to be aware of the risks.
  • Disable iCloud backups, as these do not honor your backup password, and the clear text database can be obtained, with a warrant, by law enforcement.
  • Periodically delete the application from your device and reinstall it to flush out the database. This appears to be the only way to flush out deleted records and start fresh.
    • NOTE: This will not delete databases from existing iCloud backups from the cloud.

How WhatsApp Can Fix This

Software authors should be sensitive to forensic trace in their coding. The design choices they make when developing a secure messaging app have critical implications for journalists, political dissenters, those in countries that don’t respect free speech, and many others. A poor design choice could quite realistically result in innocent people – sometimes people crucial to liberty – being imprisoned.

There are a number of ways WhatsApp could mitigate this in future versions of their application:

  • The SQLite database does not need to come off in a backup at all. The file can be marked in such a way that it will not be backed up. The developer may have chosen this behavior so that restoring to a new device will not cause you to lose your message history. Unfortunately, the tradeoff for this feature is that it becomes much easier to obtain a copy of the database.
  • In my book Hacking and Securing iOS Applications, I outline a technique that can overwrite the SQLite record content “in place” prior to deleting a record. While the record itself will remain on the free list, using this technique will clear the content out.
  • A better solution is setting PRAGMA secure_delete=ON prior to issuing the delete; this causes the deleted content to be overwritten automatically (thanks to Richard Hipp for sending me this information). A minimal sketch of this appears after this list.
  • Using an alternative storage backing such as raw files, or encrypted CoreData, could be more secure. The file system is easy to implement, and Apple’s encryption scheme would drop the file encryption key whenever a file is deleted. It may not be as pretty as SQLite, but Apple’s file-level encryption is very solid in handling deleted files. Apple uses a binary property list for archival, which is sometimes used to store live message data too on the desktop. Wickr’s encrypted CoreData approach is similarly quite secure, so long as the database keys remain on the phone. Simply using a separate SQLite file for each thread, then deleting it when finished, would be a significant improvement, even if incorporating some of the other techniques described above.
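
Here is that pragma in use via the SQLite C API (the table and column names are illustrative, not WhatsApp’s actual schema):

    #include <sqlite3.h>

    /* With secure_delete=ON, SQLite overwrites deleted content with
     * zeroes instead of merely moving pages to the free list, greatly
     * shrinking the forensic footprint of a delete. */
    int delete_thread(sqlite3 *db, int thread_id) {
        int rc = sqlite3_exec(db, "PRAGMA secure_delete=ON;",
                              NULL, NULL, NULL);
        if (rc != SQLITE_OK) return rc;

        sqlite3_stmt *stmt;
        rc = sqlite3_prepare_v2(db,
                "DELETE FROM messages WHERE thread_id = ?;", -1, &stmt, NULL);
        if (rc != SQLITE_OK) return rc;

        sqlite3_bind_int(stmt, 1, thread_id);
        rc = sqlite3_step(stmt);          /* SQLITE_DONE on success */
        sqlite3_finalize(stmt);
        return rc == SQLITE_DONE ? SQLITE_OK : rc;
    }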

WSJ Describes Reckless Behavior by FBI in Terrorism Case

The Wall Street Journal published an article today citing a source saying that the FBI plans to tell the White House that “it knows so little about the hacking tool that was used to open terrorist’s iPhone that it doesn’t make sense to launch an internal government review”. If true, this should be taken as an act of recklessness by the FBI with regards to the Syed Farook case: the FBI apparently allowed an undocumented tool to run on a piece of high-profile, terrorism-related evidence without adequate knowledge of the tool’s specific function or forensic soundness.

Best practices in forensic science dictate that any forensics instrument must be tested and validated, and accepted as forensically sound, before it can be used on live evidence. Such a tool must yield predictable, repeatable results, and an examiner must be able to explain its process in a court of law. Our court system expects this, and allows tools (and examiners) to face numerous challenges to their credibility – challenges that can only be answered by rigorous analysis. The FBI’s admission that they have so little knowledge about how the tool works is an admission that they failed to evaluate the science behind the tool, or its core functionality, in any meaningful way. Knowing how the tool managed to get into the device is the bare minimum I would expect anyone to know before shelling out over a million dollars for a solution, especially one that was going to be used on high-profile evidence.

A tool should not make changes to a device, and any changes should be documented and repeatable. There are several other variables to consider in such an effort, especially when imaging an iOS device. Apart from changes made directly by the tool (such as overwriting unallocated space, or portions of the file system journal), simply unlocking the device can cause the operating system to make a number of changes, start background tasks which could lead to destruction of data, or cause other changes unintentionally. Without knowing how the tool works, or what portions of the operating system it affects, what vulnerabilities are exploited, what the payload looks like, where the payload is written, what parts of the operating system are disabled by the tool, or a host of other important things – there is no way to effectively measure whether or not the tool is forensically sound. Simply running it against a dozen other devices to “see if it works” is not sufficient to evaluate a forensics tool – especially one that originated from a grey hat hacking group, potentially with very little actual in-house forensics expertise.

It is highly unlikely that any agency could effectively evaluate the forensic soundness of any tool without having an understanding of how it works. The FBI’s arguments to the White House with regards to this appear to many as an attempt to simply skirt the vulnerabilities equities process.

There are only two possible conclusions to draw from all of this: either the FBI is lying to the White House (misleading the President), and actually does possess enough knowledge about the tool to warrant a review, or the FBI never evaluated and validated the safety of this tool, never learned how it worked, and recklessly used it on a piece of terrorism-related evidence so high profile that it warranted an egregious abuse of the Constitution when ordering Apple to assist… yet so inconsequential a piece of evidence that the FBI didn’t have a problem running an ordinary jailbreak tool on it. This would not fall short of misleading the court.

The Syed Farook case has been fraught with recklessness. Numerous mistakes were made early on, as I’ve written about, such as changing the iCloud password and possibly even powering down the device (or letting it die), locking the encryption. When the FBI demonstrated that a mere 30 days was all that was necessary to get into the iPhone, many interpreted this as proof that adequate due diligence had not been done prior to filing for an All Writs Act order against Apple. Beyond this case, the FBI pulled out of their NY iPhone case after the passcode was given to them – further suggesting the FBI’s unwillingness or inability to do its job, to the degree of abusing the All Writs Act as an alternative to good police work.

The NY case also highlighted the DOJ’s reluctance to use an undocumented hacking tool named IP-BOX – essentially a “black box” for brute-forcing PINs on iOS 7/8 devices – which was listed as one major reason Apple’s help was required. Ironically, the FBI is claiming to have done the very thing here that they argued they shouldn’t do in the NY case: use an undocumented, opaque hacking tool that they were unable to fully understand. It would seem that situational ethics are in play.

This sets a dangerous practice in motion. The FBI has offered this tool to any other law enforcement agencies that need it; in other words, the FBI is endorsing the use of an untested tool whose inner workings they claim not to understand, for every kind of case that could go through our court system. A tool that was tested, if at all, for one very specific case is now being used on a very broad set of data and evidence, which it could easily damage or alter – or, more likely, see thrown out of cases as soon as it’s challenged. If the FBI truly does not know how this tool works, they’ve created an extremely dangerous situation for all of us.
