Candidate release of source code.

Author: Dan
Date: 2019-03-26 13:45:32 -04:00
Parent: db81e6b3b0
Commit: 79d8f164f8
12449 changed files with 2800756 additions and 16 deletions

.gitattributes vendored Normal file (56 lines)

@@ -0,0 +1,56 @@
# Set the default behavior, in case people don't have core.autocrlf set.
* text=auto
# Explicitly declare text files you want to always be normalized and converted
# to native line endings on checkout.
*.java text
*.gradle text
*.manifest text
*.css text
*.htm text
*.html text
*.js text
*.json text
*.jsp text
*.jspf text
*.jspx text
*.properties text
*.sh text
*.tld text
*.txt text
*.tag text
*.xml text
*.c text
*.h text
*.cpp text
*.hh text
*.cc text
# Declare files that will always have CRLF line endings on checkout.
*.sln text eol=crlf
*.vcproj text eol=crlf
*.vcxproj text eol=crlf
*.bat text eol=crlf
# Denote all files that are truly binary and should not be modified.
*.png binary
*.jpg binary
*.class binary
*.dll binary
*.ear binary
*.gif binary
*.ico binary
*.jar binary
*.jpeg binary
*.so binary
*.war binary
*.pdf binary
*.exe binary
*.lib binary
*.sa binary
*.gz binary
*.gzf binary
*.tgz binary
*.tar binary

.gitignore vendored (67 lines)

@@ -1,14 +1,63 @@
.gradle
*.class
Thumbs.db
.DS_Store
.svn
excludedFiles.txt
.classpath
.project
ghidra.repos.config
/*/*/*/*/*/bin/
/*/*/*/*/*/build/
/*/*/*/*/bin/
/*/*/*/*/build/
/*/*/*/bin/
/*/*/*/build/
/*/*/bin/
/*/*/build/
/*/bin/
/*/build/
/build/
/bin/
# Ignore Gradle GUI config
gradle-app.setting
**/dist
repositories/
*.sla
# Avoid ignoring Gradle wrapper jar file (.jar files are usually ignored)
!gradle-wrapper.jar
# Misc files
*.setting
*.settings
*.directory
.gradle/
.settings/
# Cache of project
.gradletasknamecache
# File locks
*.ulock
*.lock
# # Work around https://youtrack.jetbrains.com/issue/IDEA-116898
# gradle/wrapper/gradle-wrapper.properties
# Gradle creates these per developer
**/vs/
# Misc files produced while executing application
Ghidra/.ghidraSvrKeys
wrapper.log*
# Ignore object files
*.o
*.obj
# Ignore MS Visual Studio artifacts
Release
#Debug
*.ncb
*.suo
*.aps
*.vcproj.*
# Ignore UNIX backup files
*~
*.swp
# Ignore eclipse project files
.project
.classpath

DevGuide.md Normal file (275 lines)

@@ -0,0 +1,275 @@
# Developer's Guide: Getting Started
Install OpenJDK 11 and make sure it's the default java.
Install Eclipse, at least version 2018-12, and ensure it is launched using OpenJDK 11.
Technically, you can launch with any JRE/JDK, but it's up to you to ensure OpenJDK 11 is properly configured in Eclipse.
Optionally install Gradle 5.0, and ensure it is launched using OpenJDK 11.
These instructions assume you are using the gradle wrapper, so adjust the commands accordingly if you choose to use your own Gradle installation.
## Setup Repositories
Of course, you may choose any directory for your working copy, but these instructions will assume you have cloned the repo to `~/git`.
Be sure to adjust the commands to match your chosen working directory if different than suggested:
```bash
cd ~/git
git clone git@github.com:NationalSecurityAgency/ghidra.git
```
Ghidra's build uses artifacts as named in Maven Central and Bintray JCenter, when possible.
Unfortunately, in some cases, the artifact or the particular version we desire is not available.
So, in addition to mavenCentral and jcenter, you must configure a flatDir-style repository for manually-downloaded dependencies.
Create `~/.gradle/init.d/repos.gradle` with the following contents:
```groovy
ext.HOME = System.getProperty('user.home')
allprojects {
    repositories {
        mavenCentral()
        jcenter()
        flatDir name: 'flat', dirs: ["$HOME/flatRepo"]
    }
}
```
You should also create the `~/flatRepo` folder to hold the manually-downloaded dependencies:
```bash
mkdir ~/flatRepo
```
If you prefer not to modify your user-wide Gradle configuration, you may use
Gradle's other init script facilities, but you're on your own.
## Get Dependencies for FileFormats:
Download `dex-tools-2.0.zip` from the dex2jar project's releases page on GitHub.
Unpack the `dex-*.jar` files from the `lib` directory to `~/flatRepo`:
```bash
cd ~/Downloads # Or wherever
curl -OL https://github.com/pxb1988/dex2jar/releases/download/2.0/dex-tools-2.0.zip
unzip dex-tools-2.0.zip
cp dex2jar-2.0/lib/dex-*.jar ~/flatRepo/
```
Download `AXMLPrinter2.jar` from the "android4me" archive on code.google.com.
Place it in `~/flatRepo`:
```bash
cd ~/flatRepo
curl -OL https://storage.googleapis.com/google-code-archive-downloads/v2/code.google.com/android4me/AXMLPrinter2.jar
```
## Get Dependencies for DMG:
Download `hfsexplorer-0_21-bin.zip` from www.catacombae.org.
Unpack the `lib` directory to `~/flatRepo`:
```bash
cd ~/Downloads # Or wherever
curl -OL https://sourceforge.net/projects/catacombae/files/HFSExplorer/0.21/hfsexplorer-0_21-bin.zip
mkdir hfsx
cd hfsx
unzip ../hfsexplorer-0_21-bin.zip
cd lib
cp csframework.jar hfsx_dmglib.jar hfsx.jar iharder-base64.jar ~/flatRepo/
```
## Import Gradle Project
At this point, you may import Ghidra into Eclipse using the integrated BuildShip plugin.
If you prefer another IDE, there's no reason it shouldn't work, but you're on your own.
Note that the GhidraDevPlugin project requires Eclipse PDE.
Unless you are developing the GhidraDevPlugin, close that project to clear its errors.
You may see build path errors until the environment is properly prepared, as described below.
## Prepare the Environment
There are a few preparatory tasks you should execute before, or immediately after, importing the project.
These tasks will build and index the online help, and place it somewhere accessible to Ghidra when launched from Eclipse, among other things.
This task also attempts to unpack some SDKs and/or larger dependencies required by Ghidra.
We do not provide these packages out-of-the-box because of technical and legal constraints on our distributing them.
These include the Eclipse CDT, PyDev for Eclipse, and "Yet another Java service wrapper."
If you would like to build the dependent modules, please see the relevant sections below.
For now, we will exclude the affected unpack tasks.
From the project root, execute:
```bash
./gradlew prepDev -x yajswDevUnpack
```
Optionally, to pre-compile all the language modules, you may also execute:
```bash
./gradlew sleighCompile
```
Refresh the Gradle project in Eclipse.
You should not see any errors at this point, and you can accomplish many development tasks.
However, some features of Ghidra will not be functional until further steps are taken.
### Building the natives
Some of Ghidra's components are built for the native platform.
We currently support Linux, macOS, and Windows 64-bit x86 systems.
Others should be possible, but we do not support them.
#### decompile
Install bison and flex.
Now build using Gradle:
On Linux:
```bash
./gradlew decompileLinux64Executable
```
On macOS:
```bash
./gradlew decompileOsx64Executable
```
On Windows:
```cmd
gradlew decompileWin64Executable
```
#### demangler_gnu
Build using Gradle:
On Linux:
```bash
./gradlew demangler_gnuLinux64Executable
```
On macOS:
```bash
./gradlew demangler_gnuOsx64Executable
```
On Windows:
```cmd
gradlew demangler_gnuWin64Executable
```
#### sleigh
The sleigh compiler has been ported to Java, and Ghidra will automatically compile slaspec files that it finds are out of date.
The native sleigh compiler may still be useful for those who'd like quicker feedback by compiling from the command line. To build the native sleigh compiler, install bison and flex.
Now, use Gradle:
On Linux:
```bash
./gradlew sleighLinux64Executable
```
On macOS:
```bash
./gradlew sleighOsx64Executable
```
On Windows:
```cmd
gradlew sleighWin64Executable
```
### Get Dependencies for GhidraDev
Building the GhidraDev plugin for Eclipse requires the CDT and PyDev plugins for Eclipse.
Download `cdt-8.6.0.zip` from The Eclipse Foundation, and place it in a directory named:
`ghidra.bin/GhidraBuild/EclipsePlugins/GhidraDev/buildDependencies/`.
`ghidra.bin` must be a sibling of `ghidra`.
To respect the CDT project's resources, you will need to download the file using a browser, or at the very least, locate a suitable mirror on your own:
```bash
cd ~/Downloads # Or wherever
curl -OL http://$CHOOSE_YOUR_MIRROR/pub/eclipse/tools/cdt/releases/8.6/cdt-8.6.0.zip
mkdir -p ~/git/ghidra.bin/GhidraBuild/EclipsePlugins/GhidraDev/buildDependencies/
cp ~/Downloads/cdt-8.6.0.zip ~/git/ghidra.bin/GhidraBuild/EclipsePlugins/GhidraDev/buildDependencies/
```
Download `PyDev 6.3.1.zip` from www.pydev.org, and place it in the same directory:
```bash
cd ~/Downloads # Or wherever
curl -OL https://sourceforge.net/projects/pydev/files/pydev/PyDev%206.3.1/PyDev%206.3.1.zip
cp ~/Downloads/'PyDev 6.3.1.zip' ~/git/ghidra.bin/GhidraBuild/EclipsePlugins/GhidraDev/buildDependencies/
```
Use Gradle to unpack the dependencies for development and building.
First, you will need to uncomment the GhidraDev project in the `settings.gradle` file.
Then, from your clone:
```bash
./gradlew cdtUnpack pyDevUnpack
```
### Get Dependencies for GhidraServer
Building the GhidraServer requires "Yet another Java service wrapper" (yajsw) version 12.12.
Note that building the full Ghidra package requires building the GhidraServer.
Download `yajsw-stable-12.12.zip` from their project on www.sourceforge.net, and place it in a directory named:
`ghidra.bin/Ghidra/Features/GhidraServer/`:
```bash
cd ~/Downloads # Or wherever
curl -OL https://sourceforge.net/projects/yajsw/files/yajsw/yajsw-stable-12.12/yajsw-stable-12.12.zip
mkdir -p ~/git/ghidra.bin/Ghidra/Features/GhidraServer/
cp ~/Downloads/yajsw-stable-12.12.zip ~/git/ghidra.bin/Ghidra/Features/GhidraServer/
```
Use Gradle to unpack the wrapper for development.
From your clone:
```bash
./gradlew yajswDevUnpack
```
# Build the full Ghidra package
If you've followed all of the steps above, except perhaps importing to Eclipse, you should be able to produce a build.
Before building, you may want to update the version and release name.
These properties are kept in `Ghidra/application.properties`.
If you want the GhidraDev plugin included in the package, you must also build the GhidraDevPlugin module first.
We do not yet have instructions for building the GhidraDevPlugin.
It should be relatively straightforward for anyone familiar with Eclipse PDE.
To build the full package, use Gradle:
```bash
./gradlew buildGhidra
```
The output will be placed in `build/dist/`.
It will be named according to the version, release name, build date, and platform.
To test it, unzip it where you like, and execute `./ghidraRun`.
# Building Supporting Data
Some features of Ghidra require the curation of rather extensive databases.
These include the Data Type Archives and Function ID Databases, both of which require collecting header files and libraries for the relevant SDKs and platforms.
Much of this work is done by hand, and the results are simply copied into the build.
We intend to document these procedures as soon as we can.
In the meantime, those artifacts can always be extracted from our binary release.
## Building Data Type Archives
TODO
## Building FID Databases
TODO

@@ -0,0 +1,2 @@
MODULE FILE LICENSE: os/linux64/cabextract GPL 3
MODULE FILE LICENSE: os/osx64/cabextract GPL 3

@@ -0,0 +1,69 @@
apply plugin: 'eclipse'
eclipse.project.name = 'GPL CabExtract'
project.ext.cabextract = "cabextract-1.6"
/*********************************************************************************
* CabExtract extraction task
*
* Unpacks the cabextract tar file that's needed for the symbol server. This
* is only unpacked for building the tool; once it's built, this unzipped
* archive is removed.
*
* NOTE: Ant is used so that timestamps are properly preserved; failing to
* preserve them can cause the aclocal utility to be required, which may be missing!
*********************************************************************************/
task unpackCabExtract (type: Copy) {
doFirst {
delete file("build/${cabextract}")
}
doLast {
// Force all unpacked files to have the same timestamp
ant.touch() {
fileset(dir: file("build/${cabextract}"))
}
}
from tarTree(file("data/${cabextract}.tar.gz"))
into 'build'
// Force the task to be executed every time by setting to false.
// This is done since configure changes the contents for a platform
outputs.upToDateWhen { false }
}
/*********************************************************************************
* CabExtract platform specific tasks
*
* The cabextract tool requires that its 'configure' script is called before make.
*********************************************************************************/
['linux64', 'osx64'].each { platform ->
def configureName = "${platform}CabExtractConfigure"
def makeName = "${platform}CabExtractMake" // native Make task found automatically
task (configureName, type: Exec) {
group "private"
workingDir "build/${cabextract}"
executable "./configure"
dependsOn unpackCabExtract
}
task (makeName, type: Exec) {
group "private"
workingDir "build/${cabextract}"
executable "make"
dependsOn configureName
doLast {
copy {
from "build/${cabextract}/cabextract"
into "build/os/${platform}"
}
}
}
}

@@ -0,0 +1,8 @@
##VERSION: 2.0
##MODULE IP: GPL 3
##MODULE IP: Public Domain
.classpath||Public Domain||||END|
.project||Public Domain||||END|
Module.manifest||Public Domain||||END|
build.gradle||Public Domain||||END|
data/cabextract-1.6.tar.gz||GPL 3||||END|

Binary file not shown.

GPL/DMG/Module.manifest Normal file (2 lines)

@@ -0,0 +1,2 @@
# GPL because we have linked to GPL hfsx.jar
MODULE FILE LICENSE: data/lib/dmg.jar GPL 3

GPL/DMG/build.gradle Normal file (54 lines)

@@ -0,0 +1,54 @@
apply plugin: 'eclipse'
eclipse.project.name = 'GPL DMG'
/*********************************************************************************
*
* Define a new source set for the dmg source because it is not part of Ghidra; it is
* a standalone application that is executed and called from Ghidra.
*
* see DmgServerProcessManager
*
*********************************************************************************/
sourceSets {
dmg {
java {
srcDir 'src/dmg/java'
}
}
}
eclipse.classpath.plusConfigurations += [configurations.dmgCompile]
dependencies {
dmgCompile ':csframework@jar'
dmgCompile ':hfsx@jar'
dmgCompile ':hfsx_dmglib@jar'
}
/***************************************************************************************
*
* Task to create the dmg.jar file
*
***************************************************************************************/
task dmgJar(type: Jar) {
from sourceSets.dmg.output
destinationDir = file("build/data/lib")
baseName = 'dmg'
}
jar {
doLast {
File f = file("build/libs/DMG.jar")
delete "build/libs"
}
}
/***************************************************************************************
*
* Plug the dmgJar task into the global task for building and zipping contribs
*
***************************************************************************************/
assemble.dependsOn dmgJar

@@ -0,0 +1,20 @@
##VERSION: 2.0
##MODULE IP: GPL 3
##MODULE IP: LGPL 2.1
##MODULE IP: Public Domain
.classpath||Public Domain||||END|
.project||Public Domain||||END|
Module.manifest||Public Domain||||END|
build.gradle||Public Domain||||END|
data/lib/catacombae_csframework.jar||LGPL 2.1||||END|
data/lib/catacombae_hfsx.jar||GPL 3||||END|
data/lib/catacombae_hfsx_dmglib.jar||GPL 3||||END|
data/lib/catacombae_iharder-base64.jar||GPL 3||||END|
data/lib/hfsexplorer-0_21-src.zip||GPL 3||||END|
data/os/win32/llio_amd64.dll||GPL 3||||END|
data/os/win32/llio_i386.dll||GPL 3||||END|
data/os/win32/llio_ia64.dll||GPL 3||||END|
data/os/win64/llio_amd64.dll||GPL 3||||END|
data/os/win64/llio_i386.dll||GPL 3||||END|
data/os/win64/llio_ia64.dll||GPL 3||||END|
data/server_memory.cfg||Public Domain||||END|

Binary files not shown.

@@ -0,0 +1 @@
2048

@@ -0,0 +1,116 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.btree;
import java.io.IOException;
import mobiledevices.dmg.ghidra.GBinaryReader;
/**
* Represents a BTHeaderRec structure.
*
* @see <a href="https://opensource.apple.com/source/xnu/xnu-792/bsd/hfs/hfs_format.h.auto.html">hfs/hfs_format.h</a>
*/
public class BTreeHeaderRecord /*implements StructConverter*/ {
private short treeDepth;
private int rootNode;
private int leafRecords;
private int firstLeafNode;
private int lastLeafNode;
private short nodeSize;
private short maxKeyLength;
private int totalNodes;
private int freeNodes;
private short reserved1;
private int clumpSize;
private byte btreeType;
private byte keyCompareType;
private int attributes;
private int[] reserved;
BTreeHeaderRecord(GBinaryReader reader) throws IOException {
this.treeDepth = reader.readNextShort();
this.rootNode = reader.readNextInt();
this.leafRecords = reader.readNextInt();
this.firstLeafNode = reader.readNextInt();
this.lastLeafNode = reader.readNextInt();
this.nodeSize = reader.readNextShort();
this.maxKeyLength = reader.readNextShort();
this.totalNodes = reader.readNextInt();
this.freeNodes = reader.readNextInt();
this.reserved1 = reader.readNextShort();
this.clumpSize = reader.readNextInt();
this.btreeType = reader.readNextByte();
this.keyCompareType = reader.readNextByte();
this.attributes = reader.readNextInt();
this.reserved = reader.readNextIntArray(16);
}
public short getTreeDepth() {
return treeDepth;
}
public int getRootNode() {
return rootNode;
}
public int getLeafRecords() {
return leafRecords;
}
public int getFirstLeafNode() {
return firstLeafNode;
}
public int getLastLeafNode() {
return lastLeafNode;
}
public short getNodeSize() {
return nodeSize;
}
public short getMaxKeyLength() {
return maxKeyLength;
}
public int getTotalNodes() {
return totalNodes;
}
public int getFreeNodes() {
return freeNodes;
}
public short getReserved1() {
return reserved1;
}
public int getClumpSize() {
return clumpSize;
}
public byte getBtreeType() {
return btreeType;
}
public byte getKeyCompareType() {
return keyCompareType;
}
public int getAttributes() {
return attributes;
}
public int[] getReserved() {
return reserved;
}
// @Override
// public DataType toDataType() throws DuplicateNameException, IOException {
// return StructConverterUtil.toDataType( this );
// }
}

@@ -0,0 +1,17 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.btree;
/**
* Represents a BTHeaderRec attributes.
*
* @see <a href="https://opensource.apple.com/source/xnu/xnu-792/bsd/hfs/hfs_format.h.auto.html">hfs/hfs_format.h</a>
*/
public final class BTreeHeaderRecordAttributes {
public final static int kBTBadCloseMask = 0x00000001;
public final static int kBTBigKeysMask = 0x00000002;
public final static int kBTVariableIndexKeysMask = 0x00000004;
}

@@ -0,0 +1,47 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.btree;
import java.io.IOException;
import mobiledevices.dmg.ghidra.GBinaryReader;
/**
* Represents a Map Record.
*
* @see <a href="https://developer.apple.com/library/archive/technotes/tn/tn1150.html">Map Record</a>
*/
public class BTreeMapRecord /*implements StructConverter*/ {
private byte[] bitmap;
protected BTreeMapRecord(GBinaryReader reader, BTreeHeaderRecord headerRecord)
throws IOException {
this.bitmap = reader.readNextByteArray(headerRecord.getNodeSize() - 256);
}
/**
* Returns the map record node allocation bitmap.
* @return the map record node allocation bitmap
*/
public byte[] getBitmap() {
return bitmap;
}
/**
* Returns true if the specified node index is used.
* Returns false if the specified node index is free.
* @param nodeIndex the node index
* @return true if the specified node index is used, false if free
*/
public boolean isNodeUsed(int nodeIndex) {
int block = bitmap[nodeIndex / 8] & 0xff;
return (block & (1 << 7 - (nodeIndex % 8))) != 0;
}
// @Override
// public DataType toDataType() throws DuplicateNameException, IOException {
// return StructConverterUtil.toDataType( this );
// }
}
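
A minimal sketch (not part of the commit) of the bit arithmetic that `isNodeUsed` performs, run against a raw byte array rather than a parsed `BTreeMapRecord` (whose constructor needs a `GBinaryReader`); the bitmap is MSB-first, so node index 0 maps to the high bit of byte 0. The class name and bitmap values are hypothetical:
```java
public class BitmapSketch {
    public static void main(String[] args) {
        // Hypothetical two-byte allocation bitmap: 0b10100000, 0b01000000
        byte[] bitmap = { (byte) 0xA0, (byte) 0x40 };
        for (int nodeIndex : new int[] { 0, 1, 2, 9 }) {
            // Same arithmetic as BTreeMapRecord.isNodeUsed()
            int block = bitmap[nodeIndex / 8] & 0xff;
            boolean used = (block & (1 << 7 - (nodeIndex % 8))) != 0;
            System.out.println("node " + nodeIndex + " used=" + used);
        }
        // Prints: nodes 0 and 2 used, node 1 free, node 9 used
    }
}
```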

@@ -0,0 +1,128 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.btree;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import mobiledevices.dmg.ghidra.GBinaryReader;
/**
* Represents a BTNodeDescriptor structure.
*
* @see <a href="https://opensource.apple.com/source/xnu/xnu-792/bsd/hfs/hfs_format.h.auto.html">hfs/hfs_format.h</a>
*/
public class BTreeNodeDescriptor /*implements StructConverter*/ {
private int fLink;
private int bLink;
private byte kind;
private byte height;
private short numRecords;
private short reserved;
private List<Short> _recordOffsets = new ArrayList<Short>();
private List<BTreeNodeRecord> _records = new ArrayList<BTreeNodeRecord>();
BTreeNodeDescriptor(GBinaryReader reader) throws IOException {
this.fLink = reader.readNextInt();
this.bLink = reader.readNextInt();
this.kind = reader.readNextByte();
this.height = reader.readNextByte();
this.numRecords = reader.readNextShort();
this.reserved = reader.readNextShort();
}
protected void readRecordOffsets(GBinaryReader reader, long nodeStartIndex,
BTreeHeaderRecord header) throws IOException {
long position = nodeStartIndex + header.getNodeSize() - 2;
while (true) {
short recordOffset = reader.readShort(position);
if (recordOffset == 0) {
break;
}
_recordOffsets.add(recordOffset);
position = position - 2;
}
}
protected void readRecords(GBinaryReader reader, long nodeStartIndex) throws IOException {
for (int i = 0; i < getNumRecords(); ++i) {
short offset = getRecordOffsets().get(i);
long recordIndex = (offset & 0xffff) + nodeStartIndex;
reader.setPointerIndex(recordIndex);
BTreeNodeRecord record = new BTreeNodeRecord(reader, this);
_records.add(record);
}
}
public List<Short> getRecordOffsets() {
return _recordOffsets;
}
public List<BTreeNodeRecord> getRecords() {
return _records;
}
/**
* The node number of the next node of this type.
* Or, zero ( 0 ) if this is the last node.
* @return node number of the next node of this type
*/
public int getFLink() {
return fLink;
}
/**
* The node number of the previous node of this type.
* Or, zero ( 0 ) if this is the first node.
* @return node number of the previous node of this type
*/
public int getBLink() {
return bLink;
}
/**
* Returns the kind of this node.
* @return the kind of this node
* @see BTreeNodeKinds
*/
public byte getKind() {
return kind;
}
/**
* Returns the level, or depth, of this node in the B-tree hierarchy.
* @return the level, or depth, of this node in the B-tree hierarchy
*/
public byte getHeight() {
return height;
}
/**
* Returns the number of records in this node.
* @return the number of records in this node
*/
public short getNumRecords() {
return numRecords;
}
/**
* This field is reserved.
* @return this field is reserved
*/
public short getReserved() {
return reserved;
}
// @Override
// public DataType toDataType() throws DuplicateNameException, IOException {
// return StructConverterUtil.toDataType( this );
// }
}

@@ -0,0 +1,19 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.btree;
/**
* Represents kinds of BTNodeDescriptor.
*
* @see <a href="https://opensource.apple.com/source/xnu/xnu-792/bsd/hfs/hfs_format.h.auto.html">hfs/hfs_format.h</a>
* @see <a href="https://developer.apple.com/library/archive/technotes/tn/tn1150.html">B-Trees</a>
*/
public final class BTreeNodeKinds {
public final static byte kBTLeafNode = -1;
public final static byte kBTIndexNode = 0;
public final static byte kBTHeaderNode = 1;
public final static byte kBTMapNode = 2;
}

@@ -0,0 +1,140 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.btree;
import java.io.IOException;
import mobiledevices.dmg.decmpfs.DecmpfsHeader;
import mobiledevices.dmg.ghidra.GBinaryReader;
import mobiledevices.dmg.xattr.XattrConstants;
public class BTreeNodeRecord /*implements StructConverter*/ {
private int unknown0;
private int fileID;
private int unknown2;
private String type;
private int unknown3;
private int unknown4;
private int unknown5;
private int recordLength;
private short _typeLength;
private BTreeNodeDescriptor _descriptor;
private DecmpfsHeader _decmpfsHeader;
private long _offset;
BTreeNodeRecord( GBinaryReader reader, BTreeNodeDescriptor descriptor ) throws IOException {
_offset = reader.getPointerIndex();
unknown0 = reader.readNextInt();
fileID = reader.readNextInt();
unknown2 = reader.readNextInt();
_typeLength = reader.readNextShort();
type = readType( reader );
unknown3 = reader.readNextInt();
switch ( descriptor.getKind() ) {
case BTreeNodeKinds.kBTHeaderNode: {
break;
}
case BTreeNodeKinds.kBTIndexNode: {
break;
}
case BTreeNodeKinds.kBTLeafNode: {
unknown4 = reader.readNextInt();
unknown5 = reader.readNextInt();
recordLength = reader.readNextInt();
break;
}
case BTreeNodeKinds.kBTMapNode: {
break;
}
}
_descriptor = descriptor;
if ( descriptor.getKind() == BTreeNodeKinds.kBTLeafNode ) {
if ( getType().equals( XattrConstants.DECMPFS_XATTR_NAME ) ) {
_decmpfsHeader = new DecmpfsHeader( reader, getRecordLength() );
}
else if ( getType().equals( XattrConstants.KAUTH_FILESEC_XATTR_NAME ) ) {
//TODO
}
}
else if ( descriptor.getKind() == BTreeNodeKinds.kBTIndexNode ) {
if ( getType().equals( XattrConstants.DECMPFS_XATTR_NAME ) ) {
//TODO
}
}
}
private String readType( GBinaryReader reader ) throws IOException {
StringBuffer buffer = new StringBuffer();
for ( int i = 0 ; i < _typeLength ; ++i ) {
reader.readNextByte();//skip it...
buffer.append( (char) reader.readNextByte() );
}
return buffer.toString();
}
public String getType() {
return type;
}
public int getRecordLength() {
return recordLength;
}
public BTreeNodeDescriptor getDescriptor() {
return _descriptor;
}
public int getUnknown0() {
return unknown0;
}
public int getUnknown2() {
return unknown2;
}
public int getUnknown3() {
return unknown3;
}
public int getUnknown4() {
return unknown4;
}
public int getUnknown5() {
return unknown5;
}
public int getFileID() {
return fileID;
}
public DecmpfsHeader getDecmpfsHeader() {
return _decmpfsHeader;
}
public long getRecordOffset() {
return _offset;
}
// @Override
// public DataType toDataType() throws DuplicateNameException, IOException {
// String name = StructConverterUtil.parseName( BTreeNodeRecord.class );
// Structure struct = new StructureDataType( name, 0 );
// struct.add( DWORD, "unknown0", null );
// struct.add( DWORD, "fileID", null );
// struct.add( DWORD, "unknown2", null );
// struct.add( WORD, "typeLength", null );
// struct.add( UNICODE, _typeLength * 2, "type", null );
// struct.add( DWORD, "unknown3", null );
// if ( _descriptor.getKind() == BTreeNodeKinds.kBTLeafNode ) {
// struct.add( DWORD, "unknown4", null );
// struct.add( DWORD, "unknown5", null );
// struct.add( DWORD, "recordLength", null );
// }
// try {
// struct.setName( name + '_' + struct.getLength() );
// }
// catch ( Exception e ) {
// }
// return struct;
// }
}

@@ -0,0 +1,72 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.btree;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import mobiledevices.dmg.ghidra.GBinaryReader;
public class BTreeRootNodeDescriptor extends BTreeNodeDescriptor {
private BTreeHeaderRecord headerRecord;
private BTreeUserDataRecord userDataRecord;
private BTreeMapRecord mapRecord;
private List<BTreeNodeDescriptor> nodes = new ArrayList<BTreeNodeDescriptor>();
public BTreeRootNodeDescriptor( GBinaryReader reader ) throws IOException {
super( reader );
headerRecord = new BTreeHeaderRecord( reader );
userDataRecord = new BTreeUserDataRecord( reader );
mapRecord = new BTreeMapRecord( reader, headerRecord );
nodes.add( this );
int nodeSize = headerRecord.getNodeSize() & 0xffff;
for ( int i = nodeSize ; i < reader.length() ; i += nodeSize ) {
reader.setPointerIndex( i );
BTreeNodeDescriptor node = new BTreeNodeDescriptor( reader );
nodes.add( node );
node.readRecordOffsets( reader, i, headerRecord );
node.readRecords( reader, i );
}
this.readRecordOffsets( reader, 0, headerRecord );
}
public BTreeHeaderRecord getHeaderRecord() {
return headerRecord;
}
public BTreeUserDataRecord getUserDataRecord() {
return userDataRecord;
}
public BTreeMapRecord getMapRecord() {
return mapRecord;
}
public BTreeNodeDescriptor getNode( int index ) {
try {
return nodes.get( index );
}
catch (Exception e) {
return null;
}
}
public List<BTreeNodeDescriptor> getNodes() {
return nodes;
}
// @Override
// public DataType toDataType() throws DuplicateNameException, IOException {
// //we want to return the super class structure,
// //this class is synthetic
// return StructConverterUtil.toDataType( BTreeNodeDescriptor.class );
// }
}

@@ -0,0 +1,14 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.btree;
public final class BTreeTypes {
/** Control file */
public final static byte kHFSBTreeType = (byte)0;
/** User bTree types start from 128 */
public final static byte kUserBTreeType = (byte)128;
/** */
public final static byte kReservedBTreeType = (byte)255;
}

@@ -0,0 +1,31 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.btree;
import java.io.IOException;
import mobiledevices.dmg.ghidra.GBinaryReader;
/**
* Represents a User Data Record.
*
* @see <a href="https://developer.apple.com/library/archive/technotes/tn/tn1150.html">User Data Record</a>
*/
public class BTreeUserDataRecord /*implements StructConverter*/ {
private byte[] unused;
BTreeUserDataRecord(GBinaryReader reader) throws IOException {
this.unused = reader.readNextByteArray(128);
}
public byte[] getUnused() {
return unused;
}
// @Override
// public DataType toDataType() throws DuplicateNameException, IOException {
// return StructConverterUtil.toDataType( this );
// }
}

@@ -0,0 +1,22 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.decmpfs;
public final class DecmpfsCompressionTypes {
/** Uncompressed data in xattr. */
public final static int CMP_Type1 = 1;
/** Data stored in-line. */
public final static int CMP_Type3 = 3;
/** Resource fork contains compressed data. */
public final static int CMP_Type4 = 4;
/** ???? */
public final static int CMP_Type10 = 10;
public final static int CMP_MAX = 255;
}

@@ -0,0 +1,13 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.decmpfs;
public final class DecmpfsConstants {
public final static int MAX_DECMPFS_XATTR_SIZE = 3802;
public final static byte [] DECMPFS_MAGIC_BYTES = { 'f', 'p', 'm', 'c' };
public final static String DECMPFS_MAGIC = new String( DECMPFS_MAGIC_BYTES );
}

@@ -0,0 +1,79 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.decmpfs;
import java.io.IOException;
import mobiledevices.dmg.ghidra.GBinaryReader;
import mobiledevices.dmg.ghidra.GStringUtilities;
public class DecmpfsHeader /*implements StructConverter*/ {
private int compression_magic;
private int compression_type;
private long uncompressed_size;
private byte [] attr_bytes;
public DecmpfsHeader(GBinaryReader reader, int size) throws IOException {
long index = reader.getPointerIndex();
this.compression_magic = reader.readNextInt();
boolean originalEndian = reader.isLittleEndian();
reader.setLittleEndian( true );
this.compression_type = reader.readNextInt();
this.uncompressed_size = reader.readNextLong();
reader.setLittleEndian( originalEndian );
long endIndex = index + size + 1; //TODO always add 1????
if ( ( endIndex % 2 ) != 0 ) {
endIndex = endIndex - 1;
}
long nElements = endIndex - reader.getPointerIndex();
if ( ( nElements % 2 ) != 0 ) {//TODO
++nElements;
}
else if ( nElements < 0 ) {//TODO
System.err.println( "here" );
}
this.attr_bytes = reader.readNextByteArray( (int)nElements );
}
public String getCompressionMagic() {
return GStringUtilities.toString( compression_magic );
}
public int getCompressionType() {
return compression_type;
}
public long getUncompressedSize() {
return uncompressed_size;
}
public byte [] getAttrBytes() {
return attr_bytes;
}
// @Override
// public DataType toDataType() throws DuplicateNameException, IOException {
// String name = StructConverterUtil.parseName( DecmpfsHeader.class );
// Structure struct = new StructureDataType( name + "_" + attr_bytes.length, 0 );
//
// struct.add( STRING, 4, "compression_magic", null );
// struct.add( DWORD, "compression_type", null );
// struct.add( QWORD, "uncompressed_size", null );
//
// if ( attr_bytes.length > 0 ) {
// ArrayDataType byteArrayDT = new ArrayDataType( BYTE , attr_bytes.length, BYTE.getLength() );
// struct.add( byteArrayDT, "attr_bytes", null );
// }
// return struct;
// }
}

@@ -0,0 +1,13 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.decmpfs;
public final class DecmpfsStates {
public final static int FILE_TYPE_UNKNOWN = 0;
public final static int FILE_IS_NOT_COMPRESSED = 1;
public final static int FILE_IS_COMPRESSED = 2;
public final static int FILE_IS_CONVERTING = 3; //file is converting from compressed to decompressed...
}

File diff suppressed because it is too large.

@@ -0,0 +1,126 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.ghidra;
import java.io.*;
/**
* An implementation of ByteProvider where the underlying
* bytes are supplied by a random access file.
*/
public class GByteProvider implements Closeable {
private File file;
private GRandomAccessFile randomAccessFile;
/**
* Constructs a byte provider using the specified file
* @param file the file to open for random access
* @throws FileNotFoundException if the file does not exist
*/
public GByteProvider(File file) throws IOException {
this.file = file;
this.randomAccessFile = new GRandomAccessFile(file, "r");
}
/**
* Constructs a byte provider using the specified file and permissions string
* @param file the file to open for random access
* @param permissions string indicating the permissions used to open the file
* @throws FileNotFoundException if the file does not exist
*/
public GByteProvider(File file, String permissions) throws IOException {
this.file = file;
this.randomAccessFile = new GRandomAccessFile(file, permissions);
}
/**
* @see GByteProvider.app.util.bin.ByteProvider#getFile()
*/
public File getFile() {
return file;
}
/**
* @see GByteProvider.app.util.bin.ByteProvider#getName()
*/
public String getName() {
return file.getName();
}
public String getAbsolutePath() {
return file.getAbsolutePath();
}
/**
* @see GByteProvider.app.util.bin.ByteProvider#getInputStream(long)
*/
public InputStream getInputStream(long index) throws IOException {
FileInputStream is = new FileInputStream(file);
is.skip(index);
return is;
}
/**
* Closes the underlying random-access file.
* @throws IOException if an I/O error occurs
*/
@Override
public void close() throws IOException {
randomAccessFile.close();
}
/**
* @see GByteProvider.app.util.bin.ByteProvider#length()
*/
public long length() throws IOException {
return randomAccessFile.length();
}
public boolean isValidIndex(long index) {
try {
return index >= 0 && index < randomAccessFile.length();
}
catch (IOException e) {
}
return false;
}
/**
* @see GByteProvider.app.util.bin.ByteProvider#readByte(long)
*/
public byte readByte(long index) throws IOException {
randomAccessFile.seek(index);
return randomAccessFile.readByte();
}
/**
* @see GByteProvider.app.util.bin.ByteProvider#readBytes(long, long)
*/
public byte [] readBytes(long index, long length) throws IOException {
randomAccessFile.seek(index);
byte [] b = new byte[(int)length];
int nRead = randomAccessFile.read(b);
if (nRead != length) {
throw new IOException("Unable to read "+length+" bytes");
}
return b;
}
/**
* @see GByteProvider.app.util.bin.ByteProvider#writeByte(long, byte)
*/
public void writeByte(long index, byte value) throws IOException {
randomAccessFile.seek(index);
randomAccessFile.write(value);
}
/**
* @see GByteProvider.app.util.bin.ByteProvider#writeBytes(long, byte[])
*/
public void writeBytes(long index, byte [] values) throws IOException {
randomAccessFile.seek(index);
randomAccessFile.write(values);
}
}
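
A minimal usage sketch (not part of the commit), assuming a file at a hypothetical path and that `GByteProvider` from the `mobiledevices.dmg.ghidra` package above is on the classpath; the class name is illustrative only:
```java
import java.io.File;
import java.io.IOException;

public class ByteProviderSketch {
    public static void main(String[] args) throws IOException {
        File f = new File("/tmp/sample.bin"); // hypothetical input file
        GByteProvider provider = new GByteProvider(f); // opens read-only ("r")
        try {
            long len = provider.length();
            // Read up to the first 16 bytes starting at offset 0.
            byte[] head = provider.readBytes(0, Math.min(16, len));
            System.out.println("Read " + head.length + " of " + len + " bytes");
        }
        finally {
            provider.close(); // releases the underlying GRandomAccessFile
        }
    }
}
```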

@@ -0,0 +1,136 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.ghidra;
/**
* Helper methods for converting between
* number data types without negative
* promotion.
*
*
*/
public class GConv {
private GConv() {
}
/**
* A byte mask.
*/
public static final int BYTE_MASK = 0xff;
/**
* A short mask.
*/
public static final int SHORT_MASK = 0xffff;
/**
* An integer mask.
*/
public static final long INT_MASK = 0x00000000ffffffffL;
/**
* Converts a byte to a short.
* @param b the byte
* @return the short equivalent of the byte
*/
public static short byteToShort(byte b) {
return (short)(b & BYTE_MASK);
}
/**
* Converts a byte to an integer.
* @param b the byte
* @return the integer equivalent of the byte
*/
public static int byteToInt(byte b) {
return (b & BYTE_MASK);
}
/**
* Converts a byte to a long.
* @param b the byte
* @return the long equivalent of the byte
*/
public static long byteToLong(byte b) {
return intToLong(b & BYTE_MASK);
}
/**
* Converts a short to an integer.
* @param s the short
* @return the integer equivalent of the short
*/
public static int shortToInt(short s) {
return (s & SHORT_MASK);
}
/**
* Converts a short to a long.
* @param s the short
* @return the long equivalent of the short
*/
public static long shortToLong(short s) {
return intToLong(s & SHORT_MASK);
}
/**
* Converts an integer to a long.
* @param i the integer
* @return the long equivalent of the integer
*/
public static long intToLong(int i) {
return (i & INT_MASK);
}
public static String toString(byte [] array) {
StringBuffer buffer = new StringBuffer();
for (byte b : array) {
buffer.append((char)b);
}
return buffer.toString();
}
/**
* Converts a byte into a padded hex string.
* @param b the byte
* @return the padded hex string
*/
public static String toHexString(byte b) {
return zeropad(Integer.toHexString(byteToInt(b)), 2);
}
/**
* Converts a short into a padded hex string.
* @param s the short
* @return the padded hex string
*/
public static String toHexString(short s) {
return zeropad(Integer.toHexString(shortToInt(s)), 4);
}
/**
* Converts an integer into a padded hex string.
* @param i the integer
* @return the padded hex string
*/
public static String toHexString(int i) {
return zeropad(Integer.toHexString(i), 8);
}
/**
* Converts a long into a padded hex string.
* @param l the long
* @return the padded hex string
*/
public static String toHexString(long l) {
return zeropad(Long.toHexString(l), 16);
}
/**
* Returns a string that is extended to
* length len with zeroes.
* @param s The string to pad
* @param len The length of the return string
* @return A string that has been padded to be of length len
*/
public static String zeropad(String s, int len) {
if (s == null) s = "";
StringBuffer buffer = new StringBuffer(s);
int zerosNeeded = len - s.length();
for (int i = 0 ; i < zerosNeeded ; ++i) {
buffer.insert(0, '0');
}
return buffer.toString();
}
}
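
A short sketch (not part of the commit) illustrating the "without negative promotion" behavior the class comment describes, assuming `GConv` is on the classpath; the class name is hypothetical:
```java
public class GConvSketch {
    public static void main(String[] args) {
        byte b = (byte) 0xFF;
        System.out.println((int) b);            // plain widening sign-extends: -1
        System.out.println(GConv.byteToInt(b)); // masked widening: 255
        System.out.println(GConv.toHexString(b));            // "ff" (padded to 2 digits)
        System.out.println(GConv.toHexString((short) 0xFF)); // "00ff" (padded to 4 digits)
    }
}
```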

@@ -0,0 +1,218 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.ghidra;
import java.io.Serializable;
/**
*
* Defines methods to convert byte arrays to specific primitive Java types,
* and to populate byte arrays from primitive Java types.
*
*
*/
public interface GDataConverter extends Serializable {
/**
* Get the short value from the given byte array.
* @param b array containing bytes
* @throws IndexOutOfBoundsException if byte array size is
* less than 2.
*/
public short getShort(byte[] b);
/**
* Get the short value from the given byte array.
* @param b array containing bytes
* @param offset offset into byte array for getting the short
* @throws IndexOutOfBoundsException if byte array size is
* less than offset+2.
*/
public short getShort(byte[] b, int offset);
/**
* Get the int value from the given byte array.
* @param b array containing bytes
* @throws IndexOutOfBoundsException if byte array size is
* less than 4.
*/
public int getInt(byte[] b);
/**
* Get the int value from the given byte array.
* @param b array containing bytes
* @param offset offset into byte array for getting the int
* @throws IndexOutOfBoundsException if byte array size is
* less than offset+4.
*/
public int getInt(byte[] b, int offset);
/**
* Get the long value from the given byte array.
* @param b array containing bytes
* @throws IndexOutOfBoundsException if byte array size is
* less than 8.
*/
public long getLong(byte[] b);
/**
* Get the long value from the given byte array.
* @param b array containing bytes
* @param offset offset into byte array for getting the long
* @throws IndexOutOfBoundsException if byte array size is
* less than offset+8.
*/
public long getLong(byte[] b, int offset);
/**
* Get the value from the given byte array using the specified size.
* @param b array containing bytes
* @param size number of bytes to use from array at offset 0
* @throws IndexOutOfBoundsException if byte array size is
* less than size.
*/
public long getValue(byte[] b, int size);
/**
* Get the value from the given byte array using the specified size.
* @param b array containing bytes
* @param size number of bytes to use from array
* @param offset offset into byte array for getting the long
* @throws IndexOutOfBoundsException if byte array size is
* less than offset+size or size is greater than 8 (sizeof long).
*/
public long getValue(byte[] b, int offset, int size);
/**
* Converts the given value to bytes.
* @param value value to convert to bytes
* @param b byte array to store bytes
* @throws IndexOutOfBoundsException if b.length is not at least
* 2.
*/
public void getBytes(short value, byte[] b);
/**
* Converts the given value to bytes.
* @param value value to convert to bytes
* @param b byte array to store bytes
* @param offset offset into byte array to put the bytes
* @throws IndexOutOfBoundsException if (offset+2)>b.length
*/
public void getBytes(short value, byte[] b, int offset);
/**
* Converts the given value to bytes.
* @param value value to convert to bytes
* @param b byte array to store bytes
* @throws IndexOutOfBoundsException if b.length is not at least
* 4.
*/
public void getBytes(int value, byte[] b);
/**
* Converts the given value to bytes.
* @param value value to convert to bytes
* @param b byte array to store bytes
* @param offset offset into byte array to put the bytes
* @throws IndexOutOfBoundsException if (offset+4)>b.length
*/
public void getBytes(int value, byte[] b, int offset);
/**
* Converts the given value to bytes.
* @param value value to convert to bytes
* @param b byte array to store bytes
* @throws IndexOutOfBoundsException if b.length is not at least
* 8.
*/
public void getBytes(long value, byte[] b);
/**
* Converts the given value to bytes.
* @param value value to convert to bytes
* @param b byte array to store bytes
* @param offset offset into byte array to put the bytes
* @throws IndexOutOfBoundsException if (offset+8)>b.length
*/
public void getBytes(long value, byte[] b, int offset);
/**
* Converts the given value to bytes using the number of least significant bytes
* specified by size.
* @param value value to convert to bytes
* @param size number of least significant bytes of value to be written to the byte array
* @param b byte array to store bytes
* @param offset offset into byte array to put the bytes
* @throws IndexOutOfBoundsException if (offset+size)>b.length.
*/
public void getBytes(long value, int size, byte[] b, int offset);
/**
* Converts the short value to an array of bytes.
* @param value short value to be converted
* @return array of bytes
*/
public byte[] getBytes(short value);
/**
* Converts the int value to an array of bytes.
* @param value int value to be converted
* @return array of bytes
*/
public byte[] getBytes(int value);
/**
* Converts the long value to an array of bytes.
* @param value long value to be converted
* @return array of bytes
*/
public byte[] getBytes(long value);
/**
* Writes a short value into a byte array.
* @param b array to contain the bytes;
* @param value the short value
*/
public void putShort(byte[] b, short value);
/**
* Writes a short value into the byte array at the given offset
* @param b array to contain the bytes;
* @param offset the offset into the byte array to store the value.
* @param value the short value
*/
public void putShort(byte[] b, int offset, short value);
/**
* Writes a int value into a byte array.
* @param b array to contain the bytes;
* @param value the int value
*/
public void putInt(byte[] b, int value);
/**
* Writes a int value into the byte array at the given offset
* @param b array to contain the bytes;
* @param offset the offset into the byte array to store the value.
* @param value the int value
*/
public void putInt(byte[] b, int offset, int value);
/**
* Writes a long value into a byte array.
* @param b array to contain the bytes;
* @param value the long value
*/
public void putLong(byte[] b, long value);
/**
* Writes a long value into the byte array at the given offset
* @param b array to contain the bytes;
* @param offset the offset into the byte array to store the value.
* @param value the long value
*/
public void putLong(byte[] b, int offset, long value);
}

@@ -0,0 +1,224 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.ghidra;
/**
* Helper class to convert a byte array to Java primitives and primitives to a
* byte array in Big endian.
*
*
*
*/
public class GDataConverterBE implements GDataConverter {
public static final GDataConverterBE INSTANCE = new GDataConverterBE();
/**
*
*/
private static final long serialVersionUID = 1L;
/**
* Constructor for GDataConverterBE (big-endian).
*/
public GDataConverterBE() {
}
/**
* @see GDataConverter#getShort(byte[])
*/
public final short getShort(byte[] b) {
return getShort(b, 0);
}
/**
* @see GDataConverter#getShort(byte[], int)
*/
public short getShort(byte[] b, int offset) {
return (short) (((b[offset] & 0xff) << 8) | (b[offset + 1] & 0xff));
}
/**
* @see GDataConverter#getInt(byte[])
*/
public final int getInt(byte[] b) {
return getInt(b, 0);
}
/**
* @see GDataConverter#getInt(byte[], int)
*/
public int getInt(byte[] b, int offset) {
int v = b[offset];
for (int i = 1; i < 4; i++) {
v = (v << 8) | (b[offset + i] & 0xff);
}
return v;
}
/**
* @see GDataConverter#getLong(byte[])
*/
public final long getLong(byte[] b) {
return getLong(b, 0);
}
/**
* @see GDataConverter#getLong(byte[], int)
*/
public long getLong(byte[] b, int offset) {
long v = b[offset];
for (int i = 1; i < 8; i++) {
v = (v << 8) | (b[offset + i] & 0xff);
}
return v;
}
/**
* @see GDataConverter.util.DataConverter#getValue(byte[], int)
*/
public long getValue(byte[] b, int size) {
return getValue(b, 0, size);
}
/**
* @see GDataConverter.util.DataConverter#getValue(byte[], int, int)
*/
public long getValue(byte[] b, int offset, int size) {
if (size > 8) {
throw new IndexOutOfBoundsException("size exceeds sizeof long: " + size);
}
long val = 0;
for (int i = 0; i < size; i++) {
val = (val << 8) | (b[offset + i] & 0xff);
}
return val;
}
/**
* @see GDataConverter#getBytes(short, byte[])
*/
public final void getBytes(short value, byte[] b) {
getBytes(value, b, 0);
}
/**
* @see GDataConverter#getBytes(short, byte[], int)
*/
public void getBytes(short value, byte[] b, int offset) {
b[offset] = (byte) (value >> 8);
b[offset + 1] = (byte) (value & 0xff);
}
/**
* @see GDataConverter#getBytes(int, byte[])
*/
public final void getBytes(int value, byte[] b) {
getBytes(value, b, 0);
}
/**
* @see GDataConverter#getBytes(int, byte[], int)
*/
public void getBytes(int value, byte[] b, int offset) {
b[offset + 3] = (byte) (value);
for (int i = 2; i >= 0; i--) {
value >>= 8;
b[offset + i] = (byte) (value);
}
}
/**
* @see GDataConverter#getBytes(long, byte[])
*/
public final void getBytes(long value, byte[] b) {
getBytes(value, 8, b, 0);
}
/**
* @see GDataConverter#getBytes(long, byte[], int)
*/
public void getBytes(long value, byte[] b, int offset) {
getBytes(value, 8, b, offset);
}
/**
* @see GDataConverter.util.DataConverter#getBytes(long, int, byte[], int)
*/
public void getBytes(long value, int size, byte[] b, int offset) {
for (int i = size - 1; i >= 0; i--) {
b[offset + i] = (byte) value;
value >>= 8;
}
}
/**
* @see GDataConverter.util.DataConverter#putInt(byte[], int, int)
*/
public final void putInt(byte[] b, int offset, int value) {
getBytes(value, b, offset);
}
/**
* @see GDataConverter.util.DataConverter#putInt(byte[], int)
*/
public final void putInt(byte[] b, int value) {
getBytes(value, b);
}
/**
* @see GDataConverter.util.DataConverter#putLong(byte[], int, long)
*/
public final void putLong(byte[] b, int offset, long value) {
getBytes(value, b, offset);
}
/**
* @see GDataConverter.util.DataConverter#putLong(byte[], long)
*/
public final void putLong(byte[] b, long value) {
getBytes(value, b);
}
/**
* @see GDataConverter.util.DataConverter#putShort(byte[], int, short)
*/
public final void putShort(byte[] b, int offset, short value) {
getBytes(value, b, offset);
}
/**
* @see GDataConverter.util.DataConverter#putShort(byte[], short)
*/
public final void putShort(byte[] b, short value) {
getBytes(value, b);
}
/**
* @see GDataConverter.util.DataConverter#getBytes(int)
*/
public byte[] getBytes(int value) {
byte[] bytes = new byte[4];
getBytes(value, bytes);
return bytes;
}
/**
* @see GDataConverter.util.DataConverter#getBytes(long)
*/
public byte[] getBytes(long value) {
byte[] bytes = new byte[8];
getBytes(value, bytes);
return bytes;
}
/**
* @see GDataConverter.util.DataConverter#getBytes(short)
*/
public byte[] getBytes(short value) {
byte[] bytes = new byte[2];
getBytes(value, bytes);
return bytes;
}
}

@@ -0,0 +1,222 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.ghidra;
/**
*
* Helper class to convert a byte array to a Java primitive in Little endian
* order, and to convert a primitive to a byte array.
*/
public class GDataConverterLE implements GDataConverter {
public static GDataConverterLE INSTANCE = new GDataConverterLE();
/**
*
*/
private static final long serialVersionUID = 1L;
/**
* Constructor for GDataConverterLE (little-endian).
*/
public GDataConverterLE() {
}
/**
* @see GDataConverter#getShort(byte[])
*/
public final short getShort(byte[] b) {
return getShort(b, 0);
}
/**
* @see GDataConverter#getShort(byte[], int)
*/
public short getShort(byte[] b, int offset) {
return (short) (((b[offset + 1] & 0xff) << 8) | (b[offset] & 0xff));
}
/**
* @see GDataConverter#getInt(byte[])
*/
public final int getInt(byte[] b) {
return getInt(b, 0);
}
/**
* @see GDataConverter#getInt(byte[], int)
*/
public int getInt(byte[] b, int offset) {
int v = b[offset + 3];
for (int i = 2; i >= 0; i--) {
v = (v << 8) | (b[offset + i] & 0xff);
}
return v;
}
/**
* @see GDataConverter#getLong(byte[])
*/
public final long getLong(byte[] b) {
return getLong(b, 0);
}
/**
* @see GDataConverter#getLong(byte[], int)
*/
public long getLong(byte[] b, int offset) {
long v = b[offset + 7];
for (int i = 6; i >= 0; i--) {
v = (v << 8) | (b[offset + i] & 0xff);
}
return v;
}
/**
* @see ghidra.util.GDataConverter#getValue(byte[], int)
*/
public long getValue(byte[] b, int size) {
return getValue(b, 0, size);
}
/**
* @see ghidra.util.GDataConverter#getValue(byte[], int, int)
*/
public long getValue(byte[] b, int offset, int size) {
if (size > 8) {
throw new IndexOutOfBoundsException("size exceeds sizeof long: " + size);
}
long val = 0;
for (int i = size - 1; i >= 0; i--) {
val = (val << 8) | (b[offset + i] & 0xff);
}
return val;
}
/**
* @see GDataConverter#getBytes(short, byte[])
*/
public final void getBytes(short value, byte[] b) {
getBytes(value, b, 0);
}
/**
* @see GDataConverter#getBytes(short, byte[], int)
*/
public void getBytes(short value, byte[] b, int offset) {
b[offset + 1] = (byte) (value >> 8);
b[offset] = (byte) (value & 0xff);
}
/**
* @see GDataConverter#getBytes(int, byte[])
*/
public final void getBytes(int value, byte[] b) {
getBytes(value, b, 0);
}
/**
* @see GDataConverter#getBytes(int, byte[], int)
*/
public void getBytes(int value, byte[] b, int offset) {
b[offset] = (byte) (value);
for (int i = 1; i < 4; i++) {
value >>= 8;
b[offset + i] = (byte) (value);
}
}
/**
* @see GDataConverter#getBytes(long, byte[])
*/
public final void getBytes(long value, byte[] b) {
getBytes(value, 8, b, 0);
}
/**
* @see GDataConverter#getBytes(long, byte[], int)
*/
public void getBytes(long value, byte[] b, int offset) {
getBytes(value, 8, b, offset);
}
/**
* @see ghidra.util.GDataConverter#getBytes(long, int, byte[], int)
*/
public void getBytes(long value, int size, byte[] b, int offset) {
for (int i = 0; i < size; i++) {
b[offset + i] = (byte) value;
value >>= 8;
}
}
/**
* @see ghidra.util.GDataConverter#putInt(byte[], int, int)
*/
public final void putInt(byte[] b, int offset, int value) {
getBytes(value, b, offset);
}
/**
* @see ghidra.util.GDataConverter#putInt(byte[], int)
*/
public final void putInt(byte[] b, int value) {
getBytes(value, b);
}
/**
* @see ghidra.util.GDataConverter#putLong(byte[], int, long)
*/
public final void putLong(byte[] b, int offset, long value) {
getBytes(value, b, offset);
}
/**
* @see ghidra.util.GDataConverter#putLong(byte[], long)
*/
public final void putLong(byte[] b, long value) {
getBytes(value, b);
}
/**
* @see ghidra.util.GDataConverter#putShort(byte[], int, short)
*/
public final void putShort(byte[] b, int offset, short value) {
getBytes(value, b, offset);
}
/**
* @see ghidra.util.GDataConverter#putShort(byte[], short)
*/
public final void putShort(byte[] b, short value) {
getBytes(value, b);
}
/**
* @see ghidra.util.GDataConverter#getBytes(int)
*/
public byte[] getBytes(int value) {
byte[] bytes = new byte[4];
getBytes(value, bytes);
return bytes;
}
/**
* @see ghidra.util.GDataConverter#getBytes(long)
*/
public byte[] getBytes(long value) {
byte[] bytes = new byte[8];
getBytes(value, bytes);
return bytes;
}
/**
* @see ghidra.util.GDataConverter#getBytes(short)
*/
public byte[] getBytes(short value) {
byte[] bytes = new byte[2];
getBytes(value, bytes);
return bytes;
}
}
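
A short sketch (not part of the commit) contrasting the two converters on the same bytes, assuming both classes above are on the classpath; the class name is hypothetical:
```java
public class EndianSketch {
    public static void main(String[] args) {
        byte[] bytes = { 0x01, 0x02, 0x03, 0x04 };
        int be = GDataConverterBE.INSTANCE.getInt(bytes); // 0x01020304: most significant byte first
        int le = GDataConverterLE.INSTANCE.getInt(bytes); // 0x04030201: least significant byte first
        System.out.printf("BE=0x%08x LE=0x%08x%n", be, le);
        // Round trip: little-endian encoding of a short puts the low byte first.
        byte[] out = GDataConverterLE.INSTANCE.getBytes((short) 0x1234);
        // out[0] == 0x34, out[1] == 0x12
    }
}
```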

@@ -0,0 +1,63 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.ghidra;
import java.io.*;
public final class GFileUtilityMethods {
private static final String GHIDRA_FILE_SYSTEM_PREFIX = "ghidra_file_system_";
private static final String GHIDRA_FILE_SYSTEM_SUFFIX = ".tmp";
public final static File writeTemporaryFile( InputStream inputStream ) throws IOException {
return writeTemporaryFile( inputStream , Integer.MAX_VALUE );
}
public final static File writeTemporaryFile( InputStream inputStream, int maxBytesToWrite ) throws IOException {
File tempOutputFile = File.createTempFile( GHIDRA_FILE_SYSTEM_PREFIX, GHIDRA_FILE_SYSTEM_SUFFIX );
tempOutputFile.deleteOnExit();
OutputStream outputStream = new FileOutputStream( tempOutputFile );
try {
int nWritten = 0;
byte [] buffer = new byte[ 8192 ];
while ( true ) {
int nRead = inputStream.read( buffer );
if ( nRead == -1 ) {
break;
}
outputStream.write( buffer, 0, nRead );
nWritten += nRead;
if ( nWritten >= maxBytesToWrite ) {
break;
}
}
}
finally {
outputStream.close();
}
return tempOutputFile;
}
public final static File writeTemporaryFile( byte [] bytes, String prefix ) throws IOException {
if ( prefix == null ) {
prefix = GHIDRA_FILE_SYSTEM_PREFIX;
}
if ( prefix.length() < 3 ) {//temp file prefix must be at least 3 chars in length
for ( int i = prefix.length() ; i < 3 ; ++i ) {
prefix = prefix + '_';
}
}
File tempFile = File.createTempFile( prefix , GHIDRA_FILE_SYSTEM_SUFFIX );
tempFile.deleteOnExit();
OutputStream tempFileOut = new FileOutputStream( tempFile );
try {
tempFileOut.write( bytes );
}
finally {
tempFileOut.close();
}
return tempFile;
}
}
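
A brief sketch (not part of the commit) exercising both `writeTemporaryFile` overloads; the class name, prefix, and payload are illustrative only:
```java
import java.io.ByteArrayInputStream;
import java.io.File;
import java.io.IOException;

public class TempFileSketch {
    public static void main(String[] args) throws IOException {
        byte[] payload = { 'd', 'm', 'g' };
        // Write raw bytes to a self-deleting temp file with a custom prefix.
        File fromBytes = GFileUtilityMethods.writeTemporaryFile(payload, "example_");
        // Copy at most 1024 bytes from a stream into another temp file.
        File fromStream = GFileUtilityMethods.writeTemporaryFile(
            new ByteArrayInputStream(payload), 1024);
        System.out.println(fromBytes + " " + fromStream);
    }
}
```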

@@ -0,0 +1,306 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.ghidra;
import java.io.*;
/**
* Instances of this class support both reading and writing to a
* random access file. A random access file behaves like a large
* array of bytes stored in the file system. There is a kind of cursor,
* or index into the implied array, called the <em>file pointer</em>.
* This implementation relies on java.io.RandomAccessFile,
* but adds buffering to limit the amount of I/O.
*/
public class GRandomAccessFile {
private static final byte[] EMPTY = new byte[0];
private static final int BUFFER_SIZE = 0x100000;
private File file;
private RandomAccessFile randomAccessFile;
private byte[] buffer = EMPTY;
private long bufferOffset = 0;
private long bufferFileStartIndex = 0;
private byte[] lastbuffer = EMPTY;
private long lastbufferOffset = 0;
private long lastbufferFileStartIndex = 0;
private boolean open = false;
private void checkOpen() throws IOException {
if (!open) {
throw new IOException("GRandomAccessFile is closed");
}
}
/**
* Creates a random access file stream to read from, and optionally to
* write to, the file specified by the {@link File} argument. A new {@link
* FileDescriptor} object is created to represent this file connection.
*
* <p>
* This implementation relies on java.io.RandomAccessFile,
* but adds buffering to limit the amount of I/O.
* <p>
*
* <a name="mode"><p> The <tt>mode</tt> argument specifies the access mode
* in which the file is to be opened. The permitted values and their
* meanings are:
*
* <blockquote><table summary="Access mode permitted values and meanings">
* <tr><th><p align="left">Value</p></th><th><p align="left">Meaning</p></th></tr>
* <tr><td valign="top"><tt>"r"</tt></td>
* <td> Open for reading only. Invoking any of the <tt>write</tt>
* methods of the resulting object will cause an {@link
* java.io.IOException} to be thrown. </td></tr>
* <tr><td valign="top"><tt>"rw"</tt></td>
* <td> Open for reading and writing. If the file does not already
* exist then an attempt will be made to create it. </td></tr>
* <tr><td valign="top"><tt>"rws"</tt></td>
* <td> Open for reading and writing, as with <tt>"rw"</tt>, and also
* require that every update to the file's content or metadata be
* written synchronously to the underlying storage device. </td></tr>
* <tr><td valign="top"><tt>"rwd"&nbsp;&nbsp;</tt></td>
* <td> Open for reading and writing, as with <tt>"rw"</tt>, and also
* require that every update to the file's content be written
* synchronously to the underlying storage device. </td></tr>
* </table></blockquote>
*
* @param file the file object
* @param mode the access mode, as described
* <a href="#mode">above</a>
* @exception IllegalArgumentException if the mode argument is not equal
* to one of <tt>"r"</tt>, <tt>"rw"</tt>, <tt>"rws"</tt>, or
* <tt>"rwd"</tt>
* @exception FileNotFoundException if the given file object does not denote an
* existing, writable regular file and a new regular file of
* that name cannot be created, or if some other error occurs
* while opening or creating the file
*/
public GRandomAccessFile(File file, String mode) throws IOException {
this.file = file;
randomAccessFile = new RandomAccessFile(file, mode);
this.open = true;
}
@Override
protected void finalize() {
if (open) {
//TODO Msg.warn(this, "FAIL TO CLOSE " + file);
}
}
/**
* Closes this random access file stream and releases any system
* resources associated with the stream. A closed random access
* file cannot perform input or output operations and cannot be
* reopened.
* <p>
* If this file has an associated channel then the channel is closed as well.
* @exception IOException if an I/O error occurs.
*/
public void close() throws IOException {
checkOpen();
open = false;
randomAccessFile.close();
}
/**
* Returns the length of this file.
* @return the length of this file, measured in bytes.
* @exception IOException if an I/O error occurs.
*/
public long length() throws IOException {
checkOpen();
return randomAccessFile.length();
}
/**
* Sets the file-pointer offset, measured from the beginning of this
* file, at which the next read or write occurs. The offset may be
* set beyond the end of the file. Setting the offset beyond the end
* of the file does not change the file length. The file length will
* change only by writing after the offset has been set beyond the end
* of the file.
* @param pos the offset position, measured in bytes from the
* beginning of the file, at which to set the file
* pointer.
* @exception IOException if <code>pos</code> is less than
* <code>0</code> or if an I/O error occurs.
*/
public void seek(long pos) throws IOException {
checkOpen();
if (pos < 0) {
throw new IOException("pos cannot be less than zero");
}
if (pos < bufferFileStartIndex || pos >= bufferFileStartIndex + BUFFER_SIZE) {
// check if the last buffer contained it, and swap in if necessary
swapInLast();
if (pos < bufferFileStartIndex || pos >= bufferFileStartIndex + BUFFER_SIZE) {
// not in either, gotta get a new one
buffer = EMPTY;
bufferOffset = 0;
bufferFileStartIndex = pos;
}
}
bufferOffset = pos - bufferFileStartIndex;
}
/**
* This method reads a byte from the file, starting from the current file pointer.
* <p>
* This method blocks until the byte is read, the end of the stream
* is detected, or an exception is thrown.
*
* @return the next byte of this file as a signed eight-bit
* <code>byte</code>.
* @exception EOFException if this file has reached the end.
* @exception IOException if an I/O error occurs.
*/
public byte readByte() throws IOException {
checkOpen();
ensure(1);
byte b = buffer[(int) bufferOffset];
bufferOffset++;// advance the file pointer past the byte just read
return b;
}
/**
* Reads up to <code>b.length</code> bytes of data from this file
* into an array of bytes. This method blocks until at least one byte
* of input is available.
*
* @param b the buffer into which the data is read.
* @return the total number of bytes read into the buffer, or
* <code>-1</code> if there is no more data because the end of
* this file has been reached.
* @exception IOException if an I/O error occurs.
*/
public int read(byte[] b) throws IOException {
checkOpen();
return read(b, 0, b.length);
}
/**
* Reads up to <code>len</code> bytes of data from this file into an
* array of bytes. This method blocks until at least one byte of input
* is available.
*
* @param b the buffer into which the data is read.
* @param offset the start offset of the data.
* @param length the maximum number of bytes read.
* @return the total number of bytes read into the buffer, or
* <code>-1</code> if there is no more data because the end of
* the file has been reached.
* @exception IOException if an I/O error occurs.
*/
public int read(byte[] b, int offset, int length) throws IOException {
checkOpen();
int readLen = length;
do {
int blocklength = readLen;
if (readLen > (BUFFER_SIZE - bufferOffset)) {
blocklength = (BUFFER_SIZE - (int) bufferOffset);
if (blocklength <= 0) {
blocklength = BUFFER_SIZE;
}
}
ensure(blocklength);
System.arraycopy(buffer, (int) bufferOffset, b, offset, blocklength);
readLen -= blocklength;
offset += blocklength;
if (readLen > 0) {
seek(this.bufferFileStartIndex + bufferOffset + blocklength);
}
}
while (readLen > 0);
return length;
}
/**
* Writes a byte to this file, starting at the current file pointer.
* @param b the data.
* @exception IOException if an I/O error occurs.
*/
public void write(byte b) throws IOException {
checkOpen();
write(new byte[] { b }, 0, 1);
}
/**
* Writes <code>b.length</code> bytes from the specified byte array
* to this file, starting at the current file pointer.
* @param b the data.
* @exception IOException if an I/O error occurs.
*/
public void write(byte[] b) throws IOException {
checkOpen();
write(b, 0, b.length);
}
/**
* Writes a sub array as a sequence of bytes.
* @param b the data to be written
* @param offset the start offset in the data
* @param length the number of bytes that are written
* @exception IOException If an I/O error has occurred.
*/
public void write(byte[] b, int offset, int length) throws IOException {
checkOpen();
randomAccessFile.write(b, offset, length);
buffer = EMPTY;
bufferOffset = 0;
lastbuffer = EMPTY;
lastbufferOffset = 0;
}
/**
* Ensures that enough bytes are cached to
* satisfy the next request to read.
*/
private void ensure(int bytesNeeded) throws IOException {
checkOpen();
long oldFileStartIndex = bufferFileStartIndex;
long oldBufferOffset = bufferOffset;
long oldSeekPos = oldFileStartIndex + oldBufferOffset;
if (bufferOffset + bytesNeeded > buffer.length) {
// check if the last buffer contained it, and swap in if necessary
swapInLast();
// must ensure that current read pos is in old buffer, and enough bytes
long newBufferOffset = (oldSeekPos - bufferFileStartIndex);
if (oldSeekPos < bufferFileStartIndex ||
oldSeekPos >= bufferFileStartIndex + BUFFER_SIZE ||
(newBufferOffset + bytesNeeded > buffer.length)) {
bufferFileStartIndex = oldFileStartIndex + oldBufferOffset;
buffer = new byte[BUFFER_SIZE];
randomAccessFile.seek(bufferFileStartIndex);
randomAccessFile.read(buffer);
bufferOffset = 0;
}
else {
bufferOffset = newBufferOffset;
}
}
}
private void swapInLast() throws IOException {
checkOpen();
if (buffer == EMPTY) {
return;
}
// swap em and return
byte[] swapbuffer = buffer;
long swapbufferOffset = bufferOffset;
long swapbufferFileStartIndex = bufferFileStartIndex;
buffer = lastbuffer;
bufferOffset = lastbufferOffset;
bufferFileStartIndex = lastbufferFileStartIndex;
lastbuffer = swapbuffer;
lastbufferOffset = swapbufferOffset;
lastbufferFileStartIndex = swapbufferFileStartIndex;
}
}
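
A minimal read-only usage sketch of the buffered random access file above; the wrapper class is illustrative and the path is expected as a command-line argument.

```java
import java.io.File;
import java.io.IOException;

import mobiledevices.dmg.ghidra.GRandomAccessFile;

// Illustrative wrapper class, not part of the release.
public class BufferedRafExample {
	public static void main(String[] args) throws IOException {
		GRandomAccessFile raf = new GRandomAccessFile(new File(args[0]), "r");
		try {
			System.out.println("length=" + raf.length());
			byte[] header = new byte[16];
			raf.seek(0);
			raf.read(header); // served from the 1 MB internal buffer where possible
		}
		finally {
			raf.close();
		}
	}
}
```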

@@ -0,0 +1,58 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.ghidra;
/**
* Class with static methods that deal with string manipulation.
*/
public class GStringUtilities {
/**
* Converts an integer into a string.
* For example, given an integer 0x41424344,
* the returned string would be "ABCD".
* @param value the integer value
* @return the converted string
*/
public static String toString(int value) {
byte[] bytes = new byte[4];
int byteIndex = bytes.length - 1;
while (value != 0) {
bytes[byteIndex] = (byte) value;
value = value >>> 8;// unsigned shift so negative values also terminate
--byteIndex;
}
return new String(bytes);
}
public static String convertBytesToString( byte [] bytes, int length ) {
StringBuffer buf = new StringBuffer( length * 2);
for ( int i = 0 ; i < length ; ++i ) {
String bs = Integer.toHexString( bytes[ i ] & 0xff );
if ( bs.length() == 1 ) {
buf.append( "0" );
}
buf.append( bs );
}
return buf.toString();
}
public static byte[] convertStringToBytes(String hexstr) {
try {
byte[] bytes = new byte[hexstr.length() / 2];
for (int i = 0; i < hexstr.length(); i += 2) {
String bs = hexstr.substring(i, i + 2);
bytes[i / 2] = (byte) Integer.parseInt(bs, 16);
}
return bytes;
}
catch (Exception e) {
// tried, but failed
}
return null;
}
}
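
A short sketch exercising the string utilities above; the wrapper class is illustrative.

```java
import java.util.Arrays;

import mobiledevices.dmg.ghidra.GStringUtilities;

// Illustrative wrapper class, not part of the release.
public class StringUtilExample {
	public static void main(String[] args) {
		// 0x41424344 -> "ABCD", as documented above.
		System.out.println(GStringUtilities.toString(0x41424344));

		// Hex round trip: bytes -> "0a0bff" -> bytes.
		byte[] bytes = { 0x0a, 0x0b, (byte) 0xff };
		String hex = GStringUtilities.convertBytesToString(bytes, bytes.length);
		System.out.println(hex);
		System.out.println(Arrays.toString(GStringUtilities.convertStringToBytes(hex)));
	}
}
```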

@@ -0,0 +1,16 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.ghidra;
public class GSystemUtilities {
public static boolean isEqual(Object o1, Object o2) {
if (o1 == null) {
return (o2 == null);
}
return o1.equals(o2);
}
}

@@ -0,0 +1,125 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.hfsplus;
import java.io.*;
import java.util.HashMap;
import java.util.Map;
import org.catacombae.hfsexplorer.fs.ImplHFSXFileSystemView;
import org.catacombae.hfsexplorer.fs.NullProgressMonitor;
import org.catacombae.hfsexplorer.types.hfscommon.CommonHFSCatalogFile;
import org.catacombae.hfsexplorer.types.hfscommon.CommonHFSForkData;
import org.catacombae.hfsexplorer.types.hfsplus.*;
import org.catacombae.jparted.lib.fs.FSFile;
import org.catacombae.jparted.lib.fs.hfscommon.HFSCommonFSFile;
import org.catacombae.jparted.lib.fs.hfsx.HFSXFileSystemHandler;
import mobiledevices.dmg.btree.*;
import mobiledevices.dmg.decmpfs.DecmpfsHeader;
import mobiledevices.dmg.ghidra.GBinaryReader;
import mobiledevices.dmg.ghidra.GByteProvider;
/**
* This class extracts the attributes file from the HFS+ file system,
* which contains the B-tree used to locate the decmpfs compression headers.
*/
public class AttributesFileParser {
private Map<FSFile, DecmpfsHeader> map = new HashMap<FSFile, DecmpfsHeader>();
private GByteProvider provider;
private BTreeRootNodeDescriptor root;
public AttributesFileParser( HFSXFileSystemHandler handler, String prefix ) throws IOException {
ImplHFSXFileSystemView hfsxFileSystemView = (ImplHFSXFileSystemView) handler.getFSView();
HFSPlusVolumeHeader volumeHeader = hfsxFileSystemView.getHFSPlusVolumeHeader();
HFSPlusForkData attributes = volumeHeader.getAttributesFile();
File attributesFile = writeVolumeHeaderFile( hfsxFileSystemView, attributes, prefix + "_" + "attributesFile" );
provider = new GByteProvider( attributesFile );
if ( attributesFile.length() == 0 ) {
return;
}
GBinaryReader reader = new GBinaryReader( provider, false );
root = new BTreeRootNodeDescriptor( reader );
}
public void dispose() throws IOException {
map.clear();
provider.close();
}
private int getFileID(FSFile file) {
try {
HFSCommonFSFile hfsFile = (HFSCommonFSFile)file;
CommonHFSCatalogFile catalogFile = hfsFile.getInternalCatalogFile();
CommonHFSCatalogFile.HFSPlusImplementation hfsPlusCatalogFile = (CommonHFSCatalogFile.HFSPlusImplementation)catalogFile;
HFSPlusCatalogFile underlying = hfsPlusCatalogFile.getUnderlying();
HFSCatalogNodeID fileID = underlying.getFileID();
return fileID.toInt();
}
catch (Exception e) {
return -1;
}
}
private File writeVolumeHeaderFile( ImplHFSXFileSystemView hfsxFileSystemView,
HFSPlusForkData volumeHeaderFile,
String volumeHeaderFileName ) throws IOException {
if (volumeHeaderFile == null) {
return null;
}
File file = File.createTempFile( "Ghidra_" + volumeHeaderFileName + "_", ".tmp" );
file.deleteOnExit();
OutputStream out = new FileOutputStream( file );
try {
CommonHFSForkData fork = CommonHFSForkData.create( volumeHeaderFile );
hfsxFileSystemView.extractForkToStream( fork, fork.getBasicExtents(), out, new NullProgressMonitor() {} );
}
finally {
out.close();
}
return file;
}
public DecmpfsHeader getDecmpfsHeader(FSFile file) throws IOException {
if ( root == null ) {
return null;
}
if ( map.get( file ) != null ) {
return map.get( file );
}
int fileID = getFileID( file );
if ( fileID == -1 ) {
return null;
}
for ( BTreeNodeDescriptor node : root.getNodes() ) {
for ( BTreeNodeRecord record : node.getRecords() ) {
if ( record.getFileID() == fileID ) {
DecmpfsHeader header = record.getDecmpfsHeader();
if ( header != null ) {
map.put( file, header );
return header;
}
}
}
}
return null;
}
}

@@ -0,0 +1,344 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.reader;
import java.io.*;
import java.util.ArrayList;
import java.util.List;
import org.catacombae.dmgextractor.encodings.encrypted.ReadableCEncryptedEncodingStream;
import org.catacombae.hfsexplorer.FileSystemRecognizer;
import org.catacombae.hfsexplorer.PartitionSystemRecognizer;
import org.catacombae.hfsexplorer.UDIFRecognizer;
import org.catacombae.hfsexplorer.partitioning.Partition;
import org.catacombae.hfsexplorer.partitioning.PartitionSystem;
import org.catacombae.hfsexplorer.win32.WindowsLowLevelIO;
import org.catacombae.io.*;
import org.catacombae.jparted.lib.DataLocator;
import org.catacombae.jparted.lib.ReadableStreamDataLocator;
import org.catacombae.jparted.lib.fs.*;
import org.catacombae.jparted.lib.fs.FileSystemHandlerFactory.StandardAttribute;
import org.catacombae.jparted.lib.fs.hfsx.HFSXFileSystemHandler;
import org.catacombae.udif.UDIFFile;
import org.catacombae.udif.UDIFRandomAccessStream;
import mobiledevices.dmg.decmpfs.DecmpfsCompressionTypes;
import mobiledevices.dmg.decmpfs.DecmpfsHeader;
import mobiledevices.dmg.ghidra.*;
import mobiledevices.dmg.hfsplus.AttributesFileParser;
import mobiledevices.dmg.zlib.ZLIB;
public class DmgFileReader implements Closeable {
private final static GDataConverter ledc = new GDataConverterLE();
private final static GDataConverter bedc = new GDataConverterBE();
private GByteProvider provider;
private AttributesFileParser parser;
private ReadableRandomAccessStream rras;
private List<FSFolder> rootFolders = new ArrayList<FSFolder>();
private List<FileSystemHandler> fileSystemHandlers = new ArrayList<FileSystemHandler>();
public DmgFileReader( GByteProvider provider ) {
this.provider = provider;
}
public void open() throws IOException {
File file = provider.getFile();
if (WindowsLowLevelIO.isSystemSupported()) {
rras = new WindowsLowLevelIO(file.getAbsolutePath());
}
else {
rras = new ReadableFileStream(file.getAbsolutePath());
}
if (ReadableCEncryptedEncodingStream.isCEncryptedEncoding(rras)) {
//TODO use our decryption instead??
}
System.err.println("Trying to detect UDIF structure...");
if (UDIFRecognizer.isUDIF(rras)) {
System.err.println("UDIF structure found! Creating filter stream...");
UDIFFile udifFile = new UDIFFile(new ReadableFileStream(file.getAbsolutePath()));
debug(udifFile.getView().getPlistData(), "dmg-xml");
UDIFRandomAccessStream stream = new UDIFRandomAccessStream(rras);
rras = stream;
}
else {
System.err.println("UDIF structure not found. Proceeding...");
}
PartitionSystemRecognizer partitionSystemRecognizer = new PartitionSystemRecognizer(rras);
PartitionSystem partitionSystem = partitionSystemRecognizer.getPartitionSystem();
if (partitionSystem == null) {
throw new IOException("No system partitions found. Perhaps the decryption failed?");
}
Partition[] partitions = partitionSystem.getUsedPartitionEntries();
for (Partition partition : partitions) {
openPartition(partition);
}
}
private void debug( byte [] plistData, String fileName ) {
// TODO Auto-generated method stub
}
private void openPartition( Partition selectedPartition ) throws IOException {
long fsOffset = selectedPartition.getStartOffset();//getPmPyPartStart()+selectedPartition.getPmLgDataStart())*blockSize;
long fsLength = selectedPartition.getLength();//getPmDataCnt()*blockSize;
FileSystemRecognizer fsr = new FileSystemRecognizer( rras, fsOffset );
FileSystemRecognizer.FileSystemType fsType = fsr.detectFileSystem();
if ( fsType == FileSystemRecognizer.FileSystemType.HFS_PLUS ||
fsType == FileSystemRecognizer.FileSystemType.HFSX ||
fsType == FileSystemRecognizer.FileSystemType.HFS ) {
final FileSystemMajorType fsMajorType;
switch ( fsType ) {
case HFS:
fsMajorType = FileSystemMajorType.APPLE_HFS;
break;
case HFS_PLUS:
fsMajorType = FileSystemMajorType.APPLE_HFS_PLUS;
break;
case HFSX:
fsMajorType = FileSystemMajorType.APPLE_HFSX;
break;
default:
fsMajorType = null;
break;
}
FileSystemHandlerFactory factory = fsMajorType.createDefaultHandlerFactory();
if ( factory.isSupported( StandardAttribute.CACHING_ENABLED ) ) {
factory.getCreateAttributes().
setBooleanAttribute(StandardAttribute.CACHING_ENABLED,
true);
}
ReadableRandomAccessStream stage1;
if (fsLength > 0) {
stage1 = new ReadableConcatenatedStream(rras, fsOffset, fsLength);
}
else {
stage1 = rras;
}
DataLocator dataLocator = new ReadableStreamDataLocator(stage1);
FileSystemHandler fileSystemHandler = factory.createHandler(dataLocator);
fileSystemHandlers.add( fileSystemHandler );
rootFolders.add( fileSystemHandler.getRoot() );
if ( fileSystemHandler instanceof HFSXFileSystemHandler ) {
parser = new AttributesFileParser( (HFSXFileSystemHandler)fileSystemHandler, fileSystemHandler.getRoot( ).getName( ) );
}
} else {
System.err.println("UNKNOWN file system type. Can't Open filesystem. Suspect this is an APFS.\n");
}
}
@Override
public void close() throws IOException {
try {
rras.close();
}
catch (Exception e) {
//ignore
}
if ( parser != null ) {
parser.dispose();
parser = null;
}
fileSystemHandlers.clear();
rootFolders.clear();
}
public InputStream getData( FSEntry entry ) throws IOException {
if ( entry != null && entry.isFile() ) {
FSFile fsFile = (FSFile)entry;
FSFork mainFork = fsFile.getMainFork();
if ( mainFork.getLength() > 0 ) {
ReadableRandomAccessStream mainForkStream = mainFork.getReadableRandomAccessStream();
if ( mainForkStream.length() != 0 ) {
return new DmgInputStream( mainForkStream );
}
}
else if ( mainFork.getLength() == 0 ) {
FSFork resourceFork = fsFile.getForkByType( FSForkType.MACOS_RESOURCE );
ReadableRandomAccessStream resourceForkStream = resourceFork.getReadableRandomAccessStream();
if ( parser == null ) {
return null;
}
DecmpfsHeader decmpfsHeader = parser.getDecmpfsHeader( fsFile );
if ( decmpfsHeader == null ) {
return null;
}
if ( decmpfsHeader.getCompressionType() == DecmpfsCompressionTypes.CMP_Type3 ) {
if ( decmpfsHeader.getAttrBytes()[ 0 ] == -1 ) {
return new ByteArrayInputStream( decmpfsHeader.getAttrBytes(), 1, decmpfsHeader.getAttrBytes().length - 1 );
}
ZLIB zlib = new ZLIB();
InputStream inputStream =
new ByteArrayInputStream(decmpfsHeader.getAttrBytes());
ByteArrayOutputStream uncompressedBytes =
zlib.decompress(inputStream, (int) decmpfsHeader.getUncompressedSize());
File tempDecompressedFile = GFileUtilityMethods.writeTemporaryFile(
uncompressedBytes.toByteArray(), entry.getName());
return new FileInputStream(tempDecompressedFile);
}
else if ( decmpfsHeader.getCompressionType() == DecmpfsCompressionTypes.CMP_Type4 ) {
return decompressResourceFork( entry, resourceForkStream, (int)decmpfsHeader.getUncompressedSize() );
}
}
}
return null;
}
private InputStream decompressResourceFork( FSEntry entry,
ReadableRandomAccessStream resourceForkStream,
int expectedLength ) throws IOException {
File tempFile = GFileUtilityMethods.writeTemporaryFile( new DmgInputStream( resourceForkStream ) );
System.err.println(
"dmg resource fork for " + entry.getName() + ": " + tempFile.getAbsolutePath());
InputStream input = new FileInputStream( tempFile );
for ( int i = 0 ; i < 0x100 ; ++i ) {
input.read();
}
byte [] sizeBytes = new byte[ 4 ];
input.read( sizeBytes );
int size = sizeBytes[ 0 ] == 0 ?
bedc.getInt( sizeBytes ) :
ledc.getInt( sizeBytes );
byte [] flagsBytes = new byte[ 4 ];
input.read( flagsBytes );
byte [] startDistanceBytes = new byte[ 4 ];
input.read( startDistanceBytes );
int startDistance = ledc.getInt( startDistanceBytes );
input.skip( startDistance - 8 );//skip to the start of the zlib compressed file
File tempCompressedFile = GFileUtilityMethods.writeTemporaryFile( input, size - startDistance );
InputStream inputStream = new FileInputStream( tempCompressedFile );
ZLIB zlib = new ZLIB( );
ByteArrayOutputStream uncompressedByteStream = zlib.decompress( inputStream, expectedLength );
return new ByteArrayInputStream( uncompressedByteStream.toByteArray() );
}
public List<String> getInfo( String path ) {
if ( path != null ) {
DmgInfoGenerator info = new DmgInfoGenerator( this, path, parser );
return info.getInformation( );
}
return null;
}
public List<FSEntry> getListing( String path ) {
List<FSEntry> list = new ArrayList<FSEntry>();
if ( path == null || path.equals( "/" ) ) {
for ( FileSystemHandler handler : fileSystemHandlers ) {
list.add( handler.getRoot() );
}
}
else {
FSEntry fileByPath = getFileByPath( path );
if ( fileByPath != null ) {
if ( fileByPath.isFolder() ) {
FSEntry [] listEntries = fileByPath.asFolder().listEntries();
for ( FSEntry entry : listEntries ) {
list.add( entry );
}
}
}
}
return list;
}
/**
* Returns the length of the given file system entry.
* If the entry is actually a directory, then -1 is returned.
*/
public long getLength( FSEntry entry ) {
if ( entry != null && entry.isFile() ) {
FSFork mainFork = entry.asFile().getMainFork();
if ( mainFork.getLength() > 0 ) {
return mainFork.getLength();
}
try {
if (parser != null) {
DecmpfsHeader header = parser.getDecmpfsHeader(entry.asFile());
if (header != null) {
return header.getUncompressedSize();
}
}
}
catch (IOException e) {
return 1;//TODO lookup valid length in DECMPFS
}
}
return -1;
}
/**
* Convert path to string array.
*
* For example, "/a/b/c.txt" will be converted to [ "a", "b", "c.txt" ].
*
* Note: the "a" will be stripped because it corresponds to the file system handler.
*/
public String [] convertPathToArrayAndStripFileSystemName( String path ) {
String [] splitPath = path.split( "/" );
if ( splitPath.length <= 2 ) {
return new String[ 0 ];
}
String [] temp = new String[ splitPath.length - 2 ];
System.arraycopy( splitPath, 2, temp, 0, splitPath.length - 2 );
return temp;
}
/**
* Returns the DMG file object for the corresponding path.
* Path should contain the file system handler name.
*/
public FSEntry getFileByPath( String path ) {
if ( path == null || path.equals( "/" ) ) {//ROOT
return null;
}
for ( FileSystemHandler handler : fileSystemHandlers ) {
FSEntry entry = handler.getEntry( convertPathToArrayAndStripFileSystemName( path ) );
if ( entry != null ) {
return entry;
}
}
return null;
}
}
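
A minimal sketch of driving DmgFileReader directly, mirroring how DmgServer (later in this diff) uses it; the wrapper class and the command-line argument are illustrative.

```java
import java.io.File;
import java.io.IOException;
import java.util.List;

import org.catacombae.jparted.lib.fs.FSEntry;

import mobiledevices.dmg.ghidra.GByteProvider;
import mobiledevices.dmg.reader.DmgFileReader;

// Illustrative wrapper class, not part of the release.
public class ListDmgExample {
	public static void main(String[] args) throws IOException {
		File dmg = new File(args[0]); // path to a .dmg image
		try (GByteProvider provider = new GByteProvider(dmg);
				DmgFileReader reader = new DmgFileReader(provider)) {
			reader.open();
			List<FSEntry> roots = reader.getListing("/");
			for (FSEntry entry : roots) {
				System.out.println(entry.getName() + " folder=" + entry.isFolder() +
					" length=" + reader.getLength(entry));
			}
		}
	}
}
```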

@@ -0,0 +1,223 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.reader;
import java.io.IOException;
import java.text.DateFormat;
import java.util.ArrayList;
import java.util.List;
import org.catacombae.hfsexplorer.ObjectContainer;
import org.catacombae.hfsexplorer.types.hfscommon.CommonHFSCatalogFile;
import org.catacombae.hfsexplorer.types.hfsplus.HFSCatalogNodeID;
import org.catacombae.hfsexplorer.types.hfsplus.HFSPlusCatalogFile;
import org.catacombae.jparted.lib.fs.*;
import org.catacombae.jparted.lib.fs.FSAttributes.POSIXFileAttributes;
import org.catacombae.jparted.lib.fs.hfscommon.HFSCommonFSFile;
import mobiledevices.dmg.decmpfs.DecmpfsHeader;
import mobiledevices.dmg.hfsplus.AttributesFileParser;
/**
*
* @see org.catacombae.hfsexplorer.gui.FSEntrySummaryPanel
*
*/
class DmgInfoGenerator {
private DmgFileReader fileSystem;
private String filePath;
private AttributesFileParser parser;
private FSEntry entry;
private DateFormat df = DateFormat.getDateTimeInstance(DateFormat.SHORT, DateFormat.MEDIUM);
DmgInfoGenerator(DmgFileReader fileSystem, String filePath, AttributesFileParser parser) {
this.fileSystem = fileSystem;
this.filePath = filePath;
this.parser = parser;
this.entry = fileSystem.getFileByPath(filePath);
}
List<String> getInformation() {
List<String> infoList = new ArrayList<String>();
if (entry == null) {
infoList.add("<< no information available >>");
return infoList;
}
infoList.add("Name: " + entry.getName());
if (entry instanceof FSFile) {
FSFile file = (FSFile) entry;
infoList.add("Type: " + "File");
infoList.add("Total Size: " + getSizeString(file.getCombinedLength()));
FSFork[] allForks = file.getAllForks();
for (FSFork fork : allForks) {
infoList.add(
" " + fork.getForkIdentifier() + ": " + getSizeString(fork.getLength()));
}
appendFileID(infoList, file);
if (parser != null) {
try {
DecmpfsHeader decmpfsHeader = parser.getDecmpfsHeader(file);
if (decmpfsHeader != null) {
infoList.add(
"Decmpfs Size: " + getSizeString(decmpfsHeader.getUncompressedSize()));
}
}
catch (IOException e) {
}
}
}
else if (entry instanceof FSFolder) {
FSFolder folder = (FSFolder) entry;
infoList.add("Type: " + "Folder");
infoList.add("Size: " + startFolderSizeCalculation(folder));
}
else if (entry instanceof FSLink) {
FSLink link = (FSLink) entry;
FSEntry linkTarget =
link.getLinkTarget(fileSystem.convertPathToArrayAndStripFileSystemName(filePath));
if (linkTarget == null) {
infoList.add("Type: " + "Symbolic link (broken)");
infoList.add("Size: " + "- (broken link)");
}
else if (linkTarget instanceof FSFile) {
FSFile file = (FSFile) linkTarget;
infoList.add("Type: " + "Symbolic link (file)");
infoList.add("Size: " + getSizeString(file.getMainFork().getLength()));
FSFork[] allForks = file.getAllForks();
for (FSFork fork : allForks) {
infoList.add(
" " + fork.getForkIdentifier() + ": " + getSizeString(fork.getLength()));
}
}
else if (linkTarget instanceof FSFolder) {
FSFolder folder = (FSFolder) linkTarget;
infoList.add("Type: " + "Symbolic link (folder)");
infoList.add("Size: " + startFolderSizeCalculation(folder));
}
else {
infoList.add("Type: " + "Symbolic link (unknown [" + linkTarget.getClass() + "])");
infoList.add("Size: " + "- (unknown type)");
}
infoList.add("Link Target: " + link.getLinkTargetString());
}
else {
infoList.add("Type: " + "Unknown [" + entry.getClass() + "]");
infoList.add("Size: " + "- (unknown size)");
}
FSAttributes attrs = entry.getAttributes();
appendDateInformation(attrs, infoList);
appendPosixInformation(attrs, infoList);
appendWindowsInformation(attrs, infoList);
return infoList;
}
private void appendFileID(List<String> infoList, FSFile file) {
try {
HFSCommonFSFile hfsFile = (HFSCommonFSFile) file;
CommonHFSCatalogFile catalogFile = hfsFile.getInternalCatalogFile();
CommonHFSCatalogFile.HFSPlusImplementation hfsPlusCatalogFile =
(CommonHFSCatalogFile.HFSPlusImplementation) catalogFile;
HFSPlusCatalogFile underlying = hfsPlusCatalogFile.getUnderlying();
HFSCatalogNodeID fileID = underlying.getFileID();
infoList.add("File ID: 0x" + Integer.toHexString(fileID.toInt()));
}
catch (Exception e) {
infoList.add("Unable to obtain file ID. " + e.getMessage());
}
}
private void appendPosixInformation(FSAttributes attrs, List<String> infoList) {
if (attrs.hasPOSIXFileAttributes()) {
POSIXFileAttributes posixAttrs = attrs.getPOSIXFileAttributes();
infoList.add("Permissions: " + posixAttrs.getPermissionString());
infoList.add("User ID: " + posixAttrs.getUserID());
infoList.add("Group ID: " + posixAttrs.getGroupID());
}
}
private void appendWindowsInformation(FSAttributes attrs, List<String> infoList) {
if (attrs.hasWindowsFileAttributes()) {
WindowsFileAttributes windowsFileAttributes = attrs.getWindowsFileAttributes();
infoList.add("Archive: " + windowsFileAttributes.isArchive());
infoList.add("Compressed: " + windowsFileAttributes.isCompressed());
infoList.add("Directory: " + windowsFileAttributes.isDirectory());
infoList.add("Encrypted: " + windowsFileAttributes.isEncrypted());
infoList.add("Hidden: " + windowsFileAttributes.isHidden());
infoList.add("Normal: " + windowsFileAttributes.isNormal());
infoList.add("Off-line: " + windowsFileAttributes.isOffline());
infoList.add("Read-only: " + windowsFileAttributes.isReadOnly());
infoList.add("Reparse: " + windowsFileAttributes.isReparsePoint());
infoList.add("Sparse: " + windowsFileAttributes.isSparseFile());
infoList.add("System: " + windowsFileAttributes.isSystem());
infoList.add("Temp: " + windowsFileAttributes.isTemporary());
infoList.add("Virtual: " + windowsFileAttributes.isVirtual());
}
}
private void appendDateInformation(FSAttributes attributes, List<String> infoList) {
if (attributes.hasCreateDate()) {
infoList.add("Created: " + df.format(attributes.getCreateDate()));
}
if (attributes.hasModifyDate()) {
infoList.add("Contents Modified: " + df.format(attributes.getModifyDate()));
}
if (attributes.hasAttributeModifyDate()) {
infoList.add("Attributes Modified: " + df.format(attributes.getAttributeModifyDate()));
}
if (attributes.hasAccessDate()) {
infoList.add("Last Accessed: " + df.format(attributes.getAccessDate()));
}
if (attributes.hasBackupDate()) {
infoList.add("Last Backup: " + df.format(attributes.getBackupDate()));
}
}
private String getSizeString(long result) {
String baseString = Long.toString(result);
return baseString + " bytes";
}
private String startFolderSizeCalculation(FSFolder folder) {
String resultString;
try {
ObjectContainer<Long> result = new ObjectContainer<Long>((long) 0);
calculateFolderSize(folder, result);
resultString = getSizeString(result.o);
}
catch (Exception e) {
e.printStackTrace();
resultString = "Exception while calculating! See debug console for info...";
}
return resultString;
}
private void calculateFolderSize(FSFolder folder, ObjectContainer<Long> result) {
for (FSEntry entry : folder.listEntries()) {
if (entry instanceof FSFile) {
Long value = result.o;
value += ((FSFile) entry).getMainFork().getLength();
result.o = value;
}
else if (entry instanceof FSFolder) {
calculateFolderSize((FSFolder) entry, result);
}
else if (entry instanceof FSLink) {
/* Do nothing. Symbolic link targets aren't part of the folder. */
}
else {
System.err.println("FSEntrySummaryPanel.calculateFolderSize():" +
" unexpected type " + entry.getClass());
}
}
}
}

@@ -0,0 +1,43 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.reader;
import java.io.IOException;
import java.io.InputStream;
import org.catacombae.io.ReadableRandomAccessStream;
/**
* A class to wrap a ReadableRandomAccessStream
* so it may be used as a conventional
* input stream.
*/
public class DmgInputStream extends InputStream {
private ReadableRandomAccessStream stream;
DmgInputStream(ReadableRandomAccessStream stream) {
this.stream = stream;
}
public long getLength() {
return stream.length();
}
@Override
public int read() throws IOException {
return stream.read();
}
@Override
public int read(byte [] b) throws IOException {
return stream.read(b);
}
@Override
public int read(byte [] b, int off, int len) throws IOException {
return stream.read(b, off, len);
}
}

@@ -0,0 +1,193 @@
/* ###
* IP: Public Domain
*/
package mobiledevices.dmg.server;
import java.io.*;
import java.util.List;
import org.catacombae.jparted.lib.fs.*;
import mobiledevices.dmg.ghidra.GByteProvider;
import mobiledevices.dmg.ghidra.GFileUtilityMethods;
import mobiledevices.dmg.reader.DmgFileReader;
public class DmgServer {
private static void writeln(String s) {
StringBuilder encoded = new StringBuilder();
char[] charArray = s.toCharArray();
for (char c : charArray) {
if (c == 11) {
encoded.append(c);// vertical tab (0x0B) is passed through
}
if (c <= 31 || c == 127) {
continue;// control characters
}
encoded.append(c);
}
System.out.println(encoded.toString());
}
public static void sendResponse(String s) {
System.out.println(s);
System.out.flush();
}
public static void sendResponses(String... responseStrs) {
for (String s : responseStrs) {
System.out.println(s);
}
System.out.flush();
}
public static void log(String... logstrs) {
for (String s : logstrs) {
System.err.println(s);
}
System.err.flush();
}
public static void main(String[] args) {
log("Waiting for client to connect to DMG server...");
BufferedReader inputReader = new BufferedReader(new InputStreamReader(System.in));
try {
String openLine = inputReader.readLine();
if (openLine == null) {
return;
}
if (!openLine.startsWith("open ")) {
return;//TODO handle invalid initial command???
}
String openPath = parseLine(openLine);
File openFile = new File(openPath);
if (!openFile.exists()) {//TODO handle files that do not exist
}
try (GByteProvider provider = new GByteProvider(openFile);
DmgFileReader dmgFileReader = new DmgFileReader(provider);) {
dmgFileReader.open();
while (true) {
String line = inputReader.readLine();
if (line == null) {
break;
}
String[] parts = line.split(" ", 2);
if (parts.length < 1)
continue;
String cmd = parts[0];
switch (cmd) {
case "close":
log("Exiting DMG server process: close cmd");
return;
case "get_listing": {
String path = parseLine(line);
List<FSEntry> listing = dmgFileReader.getListing(path);
sendResponse("" + listing.size());//write total number of children
for (FSEntry childEntry : listing) {
// send 3 responses: name, isfolder boolean, file length
writeln(childEntry.getName());//write name of each child
sendResponses("" + childEntry.isFolder(),
"" + dmgFileReader.getLength(childEntry));
}
}
break;
case "get_info": {
String path = parseLine(line);
List<String> infoList = dmgFileReader.getInfo(path);
sendResponse("" + infoList.size());//write total number of info lines
for (String info : infoList) {
sendResponse(info);//write each info line
}
}
break;
case "get_data": {
String path = parseLine(line);
FSFile dmgFile = toFile(dmgFileReader, path);
if (dmgFile == null) {//TODO not a valid file...
sendResponse("");
}
else {
long expectedFileLength = dmgFileReader.getLength(dmgFile);
try (InputStream inputStream = dmgFileReader.getData(dmgFile)) {
if (inputStream != null) {
File temporaryFile =
GFileUtilityMethods.writeTemporaryFile(inputStream);
sendResponse(temporaryFile.getAbsolutePath());
if (expectedFileLength != temporaryFile.length()) {
log("file sizes do not match!");
}
}
else {
sendResponse("");// TODO: is this correct way to respond when error cond?
log("No data stream for get_data for " + path);
}
}
}
}
break;
}
}
}
}
catch (IOException e) {
log("IOException error in DMGServer command processing: " + e.getMessage());
e.printStackTrace(System.err);
}
finally {
log("DMG server has terminated.");
}
}
private static FSFile toFile(DmgFileReader dmgFileReader, String path) {
FSEntry entry = dmgFileReader.getFileByPath(path);
if (entry == null) {
//System.err.println("Bad path for toFile: " + path);
return null;
}
if (entry.isFile()) {
return entry.asFile();
}
else if (entry instanceof FSLink) {
int limit = 0;
while (limit++ < 10) {
FSLink link = (FSLink) entry;
FSEntry linkTarget = link.getLinkTarget(
dmgFileReader.convertPathToArrayAndStripFileSystemName(path));
if (linkTarget instanceof FSFile) {
return linkTarget.asFile();
}
else if (linkTarget instanceof FSLink) {
entry = linkTarget;
}
else {//anything else just return
break;
}
}
}
return null;
}
private static String parseLine(String openLine) {
int space = openLine.indexOf(' ');
String path = openLine.substring(space + 1).trim();
return path;
}
}
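
A rough sketch of a client for the line protocol implemented above (open, get_listing, get_info, get_data, close). Only the command names and the three-lines-per-entry get_listing response come from the code above; the jar name, main-class invocation, and image path are assumptions to be adjusted for the actual build.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;

// Illustrative client, not part of the release.
public class DmgClientExample {
	public static void main(String[] args) throws IOException {
		// Classpath is an assumption; adjust for the real build layout.
		Process server = new ProcessBuilder("java", "-cp", "dmg.jar",
			"mobiledevices.dmg.server.DmgServer").start();
		PrintWriter toServer = new PrintWriter(server.getOutputStream(), true);
		BufferedReader fromServer =
			new BufferedReader(new InputStreamReader(server.getInputStream()));

		toServer.println("open /tmp/example.dmg"); // hypothetical image path
		toServer.println("get_listing /");
		int count = Integer.parseInt(fromServer.readLine()); // number of children
		for (int i = 0; i < count; i++) {
			String name = fromServer.readLine();
			String isFolder = fromServer.readLine();
			String length = fromServer.readLine();
			System.out.println(name + " folder=" + isFolder + " length=" + length);
		}
		toServer.println("close");
	}
}
```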

Some files were not shown because too many files have changed in this diff.