GP-4009 Introduced BSim functionality including support for PostgreSQL, Elasticsearch, and H2 databases. Added BSim correlator to Version Tracking.
This commit is contained in:
caheckman
2023-11-17 01:13:42 +00:00
committed by ghidra1
parent f0f5b8f2a4
commit 0865a3dfb0
509 changed files with 77125 additions and 934 deletions
@@ -1,8 +1,6 @@
##VERSION: 2.0
##MODULE IP: Apache License 2.0
##MODULE IP: Apache License 2.0 with LLVM Exceptions
.classpath||NONE||reviewed||END|
.project||NONE||reviewed||END|
FridaNotes.txt||GHIDRA||||END|
Module.manifest||GHIDRA||||END|
build.gradle||GHIDRA||||END|
@@ -1,8 +1,6 @@
##VERSION: 2.0
##MODULE IP: Apache License 2.0
##MODULE IP: Apache License 2.0 with LLVM Exceptions
.classpath||NONE||reviewed||END|
.project||NONE||reviewed||END|
Module.manifest||GHIDRA||||END|
build.gradle||GHIDRA||||END|
src/llvm-project/lldb/bindings/java/java-typemaps.swig||Apache License 2.0 with LLVM Exceptions||||END|
@@ -1,8 +1,6 @@
##VERSION: 2.0
##MODULE IP: Apache License 2.0
##MODULE IP: Apache License 2.0 with LLVM Exceptions
.classpath||NONE||reviewed||END|
.project||NONE||reviewed||END|
InstructionsForBuildingLLDBInterface.txt||GHIDRA||||END|
Module.manifest||GHIDRA||||END|
build.gradle||GHIDRA||||END|
@@ -0,0 +1,81 @@
Installation of the Elasticsearch BSim Plug-in:
In order to use Elasticsearch as the back-end database for a BSim instance,
the lsh plug-in, included with this Ghidra extension, must be installed on
the Elasticsearch cluster.
The lsh plug-in is bundled in the standard plug-in format as the file
'lsh.zip'. It must be installed separately on EVERY node of the cluster,
and each node must be restarted after the install in order for the plug-in to
become active.
For a single node, installation is accomplished with the command-line
'elasticsearch-plugin' script that comes with the standard Elasticsearch
distribution. It expects a URL pointing to the plug-in to be installed.
The basic command, executed in the Elasticsearch installation directory
for the node, is
bin/elasticsearch-plugin install file:///path/to/ghidra/Ghidra/Extensions/BSimElasticPlugin/data/lsh.zip
Replace the initial portion of the absolute path in the URL to point to your
particular Ghidra installation.
Deployment:
Follow the Elasticsearch documentation to do any additional configuration,
starting, stopping, and management of your Elasticsearch cluster.
To try BSim with a toy deployment, you can start a single node (as per the
documentation) from the command-line by just running
bin/elasticsearch
This will dump logging messages to the console, and you should see '[lsh]'
listed among the loaded plug-ins as the node starts up.
Once the Elasticsearch node(s) are running, whether as a toy or a full
deployment, you can immediately proceed to the 'bsim' command.
The Ghidra/BSim client and 'bsim' command automatically assume an
Elasticsearch server when they see the 'https' protocol in the provided URLs,
although the 'elastic' protocol may also be specified and is equivalent.
The use of the 'http' protocol for Elasticsearch is not supported.
Adjust the hostname, port number, and repository name as appropriate.
Use a command-line similar to the following to create a BSim instance:
bsim createdatabase elastic://1.2.3.4:9200/repo medium_32
This is equivalent to:
bsim createdatabase https://1.2.3.4:9200/repo medium_32
Use a command-line like this to generate and commit signatures from a Ghidra Server
repository to the Elasticsearch database created above:
bsim generatesigs ghidra://1.2.3.4/repo bsim=elastic://1.2.3.4:9200/repo
Within Ghidra's BSim client, enter the same URL into the database connection
panel in order to place queries to your Elasticsearch deployment. See the BSim
documentation included with Ghidra for full details.
Version:
The current BSim plug-in was designed and tested with Elasticsearch version 7.17.4.
A change to the Elasticsearch scripting interface, starting with version 7.15, makes the BSim
plug-in incompatible with previous versions, but the lsh plug-in jars may work without change
across later Elasticsearch versions.
Elasticsearch plug-ins explicitly encode the version of Elasticsearch they work with, and the
plug-in script will refuse to install the lsh plug-in if its version does not match your
particular installation. If your Elasticsearch version is slightly different, you can try
unpacking the zip file, changing the version number to match your software, and then repacking
the zip file. Within the zip archive, the version number is stored in a configuration file
elasticsearch/plugin-descriptor.properties
The file format is fairly simple: edit the line
elasticsearch.version=7.17.4
The plug-in may work with other nearby versions, but proceed at your own risk.
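The unpack, edit, and repack cycle described above can also be done in place with Java's zip filesystem provider. A minimal sketch, where the class name, zip path, and target version string are illustrative:

```java
import java.io.IOException;
import java.nio.file.*;

public class RetargetPlugin {
    /** Rewrite elasticsearch.version inside the plug-in zip without unpacking it. */
    public static void retarget(Path zip, String newVersion) throws IOException {
        // The zip filesystem provider mounts the archive as a writable FileSystem
        try (FileSystem fs = FileSystems.newFileSystem(zip, (ClassLoader) null)) {
            Path desc = fs.getPath("elasticsearch", "plugin-descriptor.properties");
            String text = Files.readString(desc);
            text = text.replaceAll("(?m)^elasticsearch\\.version=.*$",
                "elasticsearch.version=" + newVersion);
            Files.writeString(desc, text);
        }
    }

    public static void main(String[] args) throws IOException {
        retarget(Paths.get(args[0]), args[1]);
    }
}
```

The archive is rewritten when the FileSystem is closed, so no separate repack step is needed.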
@@ -0,0 +1,99 @@
/* ###
* IP: GHIDRA
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
apply from: "$rootProject.projectDir/gradle/distributableGhidraExtension.gradle"
apply from: "$rootProject.projectDir/gradle/javaProject.gradle"
apply plugin: 'eclipse'
eclipse.project.name = 'Xtra BSimElasticPlugin'
// This module is very different from other Ghidra modules. It is creating a stand-alone jar
// file for an elastic database plugin. It is copying files from other modules into this module
// before building a jar file from the files in this module and the cherry-picked files from
// other modules (This is very brittle and will break if any of the files are renamed or moved.)
project.ext.includeExtensionInInstallation = true
apply plugin: 'java'
sourceSets {
elasticPlugin {
java {
srcDirs = [ 'src', 'srcdummy', 'build/genericSrc', 'build/utilitySrc', 'build/bsimSrc' ]
}
}
}
// This dependency block is needed for this code to compile in our Eclipse environment. It is not
// needed for the gradle build.
dependencies {
implementation project(':BSim')
}
libsDirName='ziplayout'
task copyGenericTask(type: Copy) {
from project(':Generic').file('src/main/java')
into 'build/genericSrc'
include 'generic/lsh/vector/*.java'
include 'generic/hash/SimpleCRC32.java'
include 'ghidra/util/xml/SpecXmlUtils.java'
}
task copyUtilityTask(type: Copy) {
from project(':Utility').file('src/main/java')
into 'build/utilitySrc'
include 'ghidra/xml/XmlPullParser.java'
include 'ghidra/xml/XmlElement.java'
}
task copyBSimTask(type: Copy) {
from project(':BSim').file('src/main/java')
into 'build/bsimSrc'
include 'ghidra/features/bsim/query/elastic/ElasticUtilities.java'
include 'ghidra/features/bsim/query/elastic/Base64Lite.java'
include 'ghidra/features/bsim/query/elastic/Base64VectorFactory.java'
}
task copyPropertiesFile(type: Copy) {
from 'contribZipExclude/plugin-descriptor.properties'
into 'build/ziplayout'
}
task elasticPluginJar(type: Jar) {
from sourceSets.elasticPlugin.output
archiveBaseName = 'lsh'
excludes = [
'**/org/apache',
'**/org/elasticsearch/common',
'**/org/elasticsearch/env',
'**/org/elasticsearch/index',
'**/org/elasticsearch/indices',
'**/org/elasticsearch/plugins',
'**/org/elasticsearch/script',
'**/org/elasticsearch/search'
]
}
task elasticPluginZip(type: Zip) {
from 'build/ziplayout'
archiveBaseName = 'lsh'
destinationDirectory = file("build/data")
}
compileElasticPluginJava.dependsOn copyGenericTask
compileElasticPluginJava.dependsOn copyUtilityTask
compileElasticPluginJava.dependsOn copyBSimTask
elasticPluginZip.dependsOn elasticPluginJar
elasticPluginZip.dependsOn copyPropertiesFile
jar.dependsOn elasticPluginZip
@@ -0,0 +1,6 @@
##VERSION: 2.0
##MODULE IP: Apache License 2.0
INSTALL.txt||GHIDRA||||END|
Module.manifest||GHIDRA||reviewed||END|
contribZipExclude/plugin-descriptor.properties||GHIDRA||||END|
extension.properties||GHIDRA||||END|
@@ -0,0 +1,6 @@
description=Feature Vector Plugin
version=1.0
name=lsh
classname=org.elasticsearch.plugin.analysis.lsh.AnalysisLSHPlugin
java.version=1.11
elasticsearch.version=8.8.1
@@ -0,0 +1,5 @@
name=BSimElasticPlugin
description=Elasticsearch backend for BSim.
author=Ghidra Team
createdOn=11/23/20
version=@extversion@
@@ -0,0 +1,134 @@
/* ###
* IP: GHIDRA
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.plugin.analysis.lsh;
import java.io.IOException;
import java.util.*;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.env.Environment;
import org.elasticsearch.index.IndexModule;
import org.elasticsearch.index.IndexSettings;
import org.elasticsearch.index.analysis.TokenizerFactory;
import org.elasticsearch.indices.analysis.AnalysisModule.AnalysisProvider;
import org.elasticsearch.plugins.*;
import org.elasticsearch.script.ScriptContext;
import org.elasticsearch.script.ScriptEngine;
import generic.lsh.vector.IDFLookup;
import generic.lsh.vector.WeightFactory;
import ghidra.features.bsim.query.elastic.Base64VectorFactory;
import ghidra.features.bsim.query.elastic.ElasticUtilities;
public class AnalysisLSHPlugin extends Plugin implements AnalysisPlugin, ScriptPlugin {
public static final String TOKENIZER_SETTINGS_BASE = "index.analysis.tokenizer.lsh_";
public static String settingString = "";
static private Map<String, Base64VectorFactory> vecFactoryMap = new HashMap<>();
private Map<String, AnalysisProvider<TokenizerFactory>> tokFactoryMap;
public class TokenizerFactoryProvider implements AnalysisProvider<TokenizerFactory> {
@Override
public TokenizerFactory get(IndexSettings indexSettings, Environment env, String name,
Settings settings) throws IOException {
// settingString = settingString + " : " + indexSettings.getIndex().getName() + '(' + name + ')';
return new LSHTokenizerFactory(indexSettings, env, name, settings);
}
}
public AnalysisLSHPlugin() {
TokenizerFactoryProvider provider = new TokenizerFactoryProvider();
tokFactoryMap = Collections.singletonMap("lsh_tokenizer", provider);
}
private static void setupVectorFactory(String name, String idfConfig, String lshWeights) {
WeightFactory weightFactory = new WeightFactory();
String[] split = lshWeights.split(" ");
double[] weightArray = new double[split.length];
for (int i = 0; i < weightArray.length; ++i) {
weightArray[i] = Double.parseDouble(split[i]);
}
weightFactory.set(weightArray);
IDFLookup idfLookup = new IDFLookup();
split = idfConfig.split(" ");
int[] intArray = new int[split.length];
for (int i = 0; i < intArray.length; ++i) {
intArray[i] = Integer.parseInt(split[i]);
}
idfLookup.set(intArray);
Base64VectorFactory vectorFactory = new Base64VectorFactory();
// Server-side factory is never used to generate signatures,
// so we don't need to specify settings
vectorFactory.set(weightFactory, idfLookup, 0);
vecFactoryMap.put(name, vectorFactory);
}
/**
* Entry point for Tokenizer and Script factories to grab the global vector factory
* @param name is the name of the tokenizer
* @return the vector factory used by the tokenizer
*/
public static Base64VectorFactory getVectorFactory(String name) {
return vecFactoryMap.get(name);
}
@Override
public void onIndexModule(IndexModule indexModule) {
super.onIndexModule(indexModule);
Settings settings = indexModule.getSettings();
String name = null;
// Look for the specific kind of tokenizer settings, within the global settings for the index
for (String key : settings.keySet()) {
if (key.startsWith(TOKENIZER_SETTINGS_BASE)) {
// We can have different settings for different indices, distinguished by this name
int pos = key.indexOf('.', TOKENIZER_SETTINGS_BASE.length() + 1);
if (pos > 0) {
name = key.substring(TOKENIZER_SETTINGS_BASE.length(), pos);
break;
}
}
}
if (name != null) {
String tokenizerName = "lsh_" + name;
if (getVectorFactory(tokenizerName) != null) {
return; // Factory already exists
}
settingString = settingString + " : onModule(" + name + ')';
// If we found LSH tokenizer settings, pull them out and construct an LSHVectorFactory with them
String baseKey = TOKENIZER_SETTINGS_BASE + name + '.';
String idfConfig = settings.get(baseKey + ElasticUtilities.IDF_CONFIG);
String lshWeights = settings.get(baseKey + ElasticUtilities.LSH_WEIGHTS);
if (idfConfig == null || lshWeights == null) {
return; // IDF_CONFIG and LSH_WEIGHTS settings must be present to proceed
}
setupVectorFactory(tokenizerName, idfConfig, lshWeights);
}
}
@Override
public ScriptEngine getScriptEngine(Settings settings, Collection<ScriptContext<?>> contexts) {
return new BSimScriptEngine();
}
@Override
public Map<String, AnalysisProvider<TokenizerFactory>> getTokenizers() {
return tokFactoryMap;
}
}
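The loop in onIndexModule recovers the per-index name by slicing the settings key between the fixed 'lsh_' prefix and the next dot. A standalone sketch of that slicing (the class name and sample key are illustrative):

```java
public class KeyScan {
    static final String TOKENIZER_SETTINGS_BASE = "index.analysis.tokenizer.lsh_";

    /** Extract the tokenizer suffix name from a full settings key, or null. */
    public static String extractName(String key) {
        if (!key.startsWith(TOKENIZER_SETTINGS_BASE)) {
            return null;
        }
        // Find the dot that terminates the name portion of the key
        int pos = key.indexOf('.', TOKENIZER_SETTINGS_BASE.length() + 1);
        if (pos < 0) {
            return null;
        }
        return key.substring(TOKENIZER_SETTINGS_BASE.length(), pos);
    }

    public static void main(String[] args) {
        System.out.println(extractName("index.analysis.tokenizer.lsh_medium_32.idf_config"));
        // prints: medium_32
    }
}
```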
@@ -0,0 +1,54 @@
/* ###
* IP: GHIDRA
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.plugin.analysis.lsh;
import java.util.*;
import org.elasticsearch.script.*;
public class BSimScriptEngine implements ScriptEngine {
private final static String ENGINE_NAME = "bsim_scripts";
@Override
public <FactoryType> FactoryType compile(String scriptName, String scriptSource,
ScriptContext<FactoryType> context, Map<String, String> params) {
if (context.equals(ScoreScript.CONTEXT) == false) {
throw new IllegalArgumentException(
getType() + " scripts cannot be used for context [" + context.name + "]");
}
if (VectorCompareScriptFactory.SCRIPT_NAME.equals(scriptSource)) {
ScoreScript.Factory factory = new VectorCompareScriptFactory();
return context.factoryClazz.cast(factory);
}
throw new IllegalArgumentException("Unknown script name " + scriptSource);
}
@Override
public void close() {
// Can free up resources
}
@Override
public Set<ScriptContext<?>> getSupportedContexts() {
return Collections.singleton(ScoreScript.CONTEXT);
}
@Override
public String getType() {
return ENGINE_NAME;
}
}
@@ -0,0 +1,293 @@
/* ###
* IP: GHIDRA
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.plugin.analysis.lsh;
import generic.lsh.vector.HashEntry;
import ghidra.features.bsim.query.elastic.Base64Lite;
/**
* Class for calculating the bin ids on LSHVectors as part of the LSH indexing process
*
*/
public class LSHBinner {
private static final char[] hashSignTable = new char[512];
private static int VEC_SIZE_UPPER = 5; // Size above which to use FFT to calculate dotproduct family
private static int LSH_HASHBASE = 0xd7e6a299;
private static int HASH_MULTIPLIER = 1103515245;
private static int HASH_ADDEND = 12345;
public static class BytesRef {
public char[] buffer;
public BytesRef(int size) { buffer = new char[size]; }
}
private int k; // Number of bits per bin id
private int L; // Number of binnings
private double doubleBuffer[]; // Scratch space for dot-product calculation
private BytesRef tokenList[]; // Final token list used by lucene
static {
/**
* This is a precalculated table for generating dot-products with the random family of vectors directly
* The first vector r_0 is expressed as a hashing function on the dimension index and the other vectors
* are derived from r_0 using an FFT. The table is formed by precalculating the FFT on basis vectors in this table
*/
int i, j;
int[] arr = new int[16];
int hibit0ptr;
int hibit1ptr;
for (i = 0; i < 16; ++i) { /* For each 4-bit position */
hibit0ptr = i * 16;
hibit1ptr = (i + 16) * 16;
for (j = 0; j < 16; ++j)
arr[j] = 0;
arr[i] = 1;
hashFft16(arr);
for (j = 0; j < 16; ++j) {
if (arr[j] > 0) {
hashSignTable[hibit0ptr + j] = '+';
hashSignTable[hibit1ptr + j] = '-';
} else {
hashSignTable[hibit0ptr + j] = '-';
hashSignTable[hibit1ptr + j] = '+';
}
}
}
}
/**
* Raw Fast Fourier Transform on 16 wide integer array
* @param arr is the 16-long array
*/
private static void hashFft16(int[] arr) {
int x,y;
x = arr[0]; y = arr[8]; arr[0] = x + y; arr[8] = x - y;
x = arr[1]; y = arr[9]; arr[1] = x + y; arr[9] = x - y;
x = arr[2]; y = arr[10]; arr[2] = x + y; arr[10] = x - y;
x = arr[3]; y = arr[11]; arr[3] = x + y; arr[11] = x - y;
x = arr[4]; y = arr[12]; arr[4] = x + y; arr[12] = x - y;
x = arr[5]; y = arr[13]; arr[5] = x + y; arr[13] = x - y;
x = arr[6]; y = arr[14]; arr[6] = x + y; arr[14] = x - y;
x = arr[7]; y = arr[15]; arr[7] = x + y; arr[15] = x - y;
x = arr[0]; y = arr[4]; arr[0] = x + y; arr[4] = x - y;
x = arr[1]; y = arr[5]; arr[1] = x + y; arr[5] = x - y;
x = arr[2]; y = arr[6]; arr[2] = x + y; arr[6] = x - y;
x = arr[3]; y = arr[7]; arr[3] = x + y; arr[7] = x - y;
x = arr[8]; y = arr[12]; arr[8] = x + y; arr[12] = x - y;
x = arr[9]; y = arr[13]; arr[9] = x + y; arr[13] = x - y;
x = arr[10]; y = arr[14]; arr[10] = x + y; arr[14] = x - y;
x = arr[11]; y = arr[15]; arr[11] = x + y; arr[15] = x - y;
x = arr[0]; y = arr[2]; arr[0] = x + y; arr[2] = x - y;
x = arr[1]; y = arr[3]; arr[1] = x + y; arr[3] = x - y;
x = arr[4]; y = arr[6]; arr[4] = x + y; arr[6] = x - y;
x = arr[5]; y = arr[7]; arr[5] = x + y; arr[7] = x - y;
x = arr[8]; y = arr[10]; arr[8] = x + y; arr[10] = x - y;
x = arr[9]; y = arr[11]; arr[9] = x + y; arr[11] = x - y;
x = arr[12]; y = arr[14]; arr[12] = x + y; arr[14] = x - y;
x = arr[13]; y = arr[15]; arr[13] = x + y; arr[15] = x - y;
x = arr[0]; y = arr[1]; arr[0] = x + y; arr[1] = x - y;
x = arr[2]; y = arr[3]; arr[2] = x + y; arr[3] = x - y;
x = arr[4]; y = arr[5]; arr[4] = x + y; arr[5] = x - y;
x = arr[6]; y = arr[7]; arr[6] = x + y; arr[7] = x - y;
x = arr[8]; y = arr[9]; arr[8] = x + y; arr[9] = x - y;
x = arr[10]; y = arr[11]; arr[10] = x + y; arr[11] = x - y;
x = arr[12]; y = arr[13]; arr[12] = x + y; arr[13] = x - y;
x = arr[14]; y = arr[15]; arr[14] = x + y; arr[15] = x - y;
}
/**
* Raw Fast Fourier Transform on 16 wide array of doubles
* @param arr is the 16-long array
*/
private static void hashFft16(double[] arr) {
double x,y;
x = arr[0]; y = arr[8]; arr[0] = x + y; arr[8] = x - y;
x = arr[1]; y = arr[9]; arr[1] = x + y; arr[9] = x - y;
x = arr[2]; y = arr[10]; arr[2] = x + y; arr[10] = x - y;
x = arr[3]; y = arr[11]; arr[3] = x + y; arr[11] = x - y;
x = arr[4]; y = arr[12]; arr[4] = x + y; arr[12] = x - y;
x = arr[5]; y = arr[13]; arr[5] = x + y; arr[13] = x - y;
x = arr[6]; y = arr[14]; arr[6] = x + y; arr[14] = x - y;
x = arr[7]; y = arr[15]; arr[7] = x + y; arr[15] = x - y;
x = arr[0]; y = arr[4]; arr[0] = x + y; arr[4] = x - y;
x = arr[1]; y = arr[5]; arr[1] = x + y; arr[5] = x - y;
x = arr[2]; y = arr[6]; arr[2] = x + y; arr[6] = x - y;
x = arr[3]; y = arr[7]; arr[3] = x + y; arr[7] = x - y;
x = arr[8]; y = arr[12]; arr[8] = x + y; arr[12] = x - y;
x = arr[9]; y = arr[13]; arr[9] = x + y; arr[13] = x - y;
x = arr[10]; y = arr[14]; arr[10] = x + y; arr[14] = x - y;
x = arr[11]; y = arr[15]; arr[11] = x + y; arr[15] = x - y;
x = arr[0]; y = arr[2]; arr[0] = x + y; arr[2] = x - y;
x = arr[1]; y = arr[3]; arr[1] = x + y; arr[3] = x - y;
x = arr[4]; y = arr[6]; arr[4] = x + y; arr[6] = x - y;
x = arr[5]; y = arr[7]; arr[5] = x + y; arr[7] = x - y;
x = arr[8]; y = arr[10]; arr[8] = x + y; arr[10] = x - y;
x = arr[9]; y = arr[11]; arr[9] = x + y; arr[11] = x - y;
x = arr[12]; y = arr[14]; arr[12] = x + y; arr[14] = x - y;
x = arr[13]; y = arr[15]; arr[13] = x + y; arr[15] = x - y;
x = arr[0]; y = arr[1]; arr[0] = x + y; arr[1] = x - y;
x = arr[2]; y = arr[3]; arr[2] = x + y; arr[3] = x - y;
x = arr[4]; y = arr[5]; arr[4] = x + y; arr[5] = x - y;
x = arr[6]; y = arr[7]; arr[6] = x + y; arr[7] = x - y;
x = arr[8]; y = arr[9]; arr[8] = x + y; arr[9] = x - y;
x = arr[10]; y = arr[11]; arr[10] = x + y; arr[11] = x - y;
x = arr[12]; y = arr[13]; arr[12] = x + y; arr[13] = x - y;
x = arr[14]; y = arr[15]; arr[14] = x + y; arr[15] = x - y;
}
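The two hashFft16 methods above are fully unrolled 16-point Walsh-Hadamard transforms: each pass adds and subtracts pairs at distances 8, 4, 2, then 1, with no twiddle factors, so applying the transform twice scales a vector by 16. A compact loop form of the same butterflies, useful for cross-checking the unrolled code (the class name is illustrative):

```java
public class WhtCheck {
    // Loop form of the 16-point Walsh-Hadamard transform that the
    // unrolled hashFft16 butterflies compute: stages at span 8, 4, 2, 1
    public static void wht16(int[] arr) {
        for (int span = 8; span >= 1; span >>= 1) {
            for (int base = 0; base < 16; base += 2 * span) {
                for (int j = base; j < base + span; ++j) {
                    int x = arr[j];
                    int y = arr[j + span];
                    arr[j] = x + y;
                    arr[j + span] = x - y;
                }
            }
        }
    }

    public static void main(String[] args) {
        // The transform is self-inverse up to the scale factor 16
        int[] v = { 3, -1, 4, 1, -5, 9, 2, -6, 5, 3, -5, 8, 9, -7, 9, 3 };
        int[] w = v.clone();
        wht16(w);
        wht16(w);
        for (int i = 0; i < 16; ++i) {
            if (w[i] != 16 * v[i]) {
                throw new AssertionError("mismatch at index " + i);
            }
        }
        System.out.println("ok");
    }
}
```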
public LSHBinner() {
doubleBuffer = new double[16];
k = -1;
L = -1;
tokenList = null;
}
public void setKandL(int k,int L) {
this.k = k;
this.L = L;
int numBits = 1;
while ((1 << numBits) <= L)
numBits += 1;
numBits += k;
int numChar = numBits / 6;
if ((numBits % 6) != 0)
numChar += 1;
tokenList = new BytesRef[L];
for(int i=0;i<L;++i) {
tokenList[i] = new BytesRef(numChar);
}
}
public BytesRef[] getTokenList() {
return tokenList;
}
/**
* Generate a dot product of the hash vector in -vec- with a random family of 16 vectors, { r }
* r_0 is a randomly generated set of +1 -1 coefficients across all the dimensions (indexed by uint32 vec[i].hash)
* The coefficient is calculated as a hashing function from the seed -hashcur- and the index (vec[i].hash),
* so it should be balanced between +1 and -1.
* All the other vectors are generated from an FFT of r_0. This allows the dotproduct with vec to be calculated
* using an FFT if -vec- has many non-zero coefficients. If -vec- has only a few non-zero coefficients,
the dotproduct is calculated with each vector in the family directly for better efficiency.
* The resulting dotproducts are converted into a 16-long bitvector based on the sign of the dotproduct and
* placed in -bucket-
* @param bucket is the (possibly partially filled) accumulator for dotproduct bits
* @param vec is the HashEntry vector to calculate the dot-products on
@param hashcur is the hash seed selecting the subfamily that represents r_0
* @return the bucket with new accumulated dot-product bits
*/
private int hash16DotProduct(int bucket,HashEntry[] vec,int hashcur)
{
int i, j;
int rowNum;
int signPtr;
for (i = 0; i < 16; ++i)
doubleBuffer[i] = 0.0; // Initialize the dotproduct results to zero
if (vec.length < VEC_SIZE_UPPER) { // If there are a small number of non-zero coefficients in -vec-
for (i = 0; i < vec.length; ++i) {
rowNum = vec[i].getHash() ^ hashcur; // Calculate the rest of the r_0 hashing function
rowNum = (rowNum * HASH_MULTIPLIER) + HASH_ADDEND;
rowNum = (rowNum >>> 24) & 0x1f;
signPtr = rowNum * 16;
for (j = 0; j < 16; ++j) { // Based on the precalculated coeff table calculate this portion of dotproduct
if (hashSignTable[signPtr + j] == '+')
doubleBuffer[j] += vec[i].getCoeff(); // Dot product with a +1 coefficient
else
doubleBuffer[j] -= vec[i].getCoeff(); // Dot product with a -1 coefficient
}
}
}
else { // If we have many non-zero coefficients in -vec-
for (i = 0; i < vec.length; ++i) {
rowNum = vec[i].getHash() ^ hashcur; // Calculate the rest of the r_0 hashing function
rowNum = (rowNum * HASH_MULTIPLIER) + HASH_ADDEND;
rowNum = (rowNum >>> 24) & 0x1f;
if (rowNum < 0x10) // Set-up for the FFT
doubleBuffer[rowNum] += vec[i].getCoeff();
else
doubleBuffer[rowNum & 0xf] -= vec[i].getCoeff();
}
hashFft16(doubleBuffer); // Calculate the remaining dot-products by performing the FFT
}
for (i = 0; i < 16; ++i) { // Convert the dot-product results to a bit-vector
bucket <<= 1;
if (doubleBuffer[i] > 0.0)
bucket |= 1;
}
return bucket;
}
public void generateBinIds(HashEntry[] vec)
{
int bucket = 0;
int bucketcnt = 0;
int i,bitsleft;
int curid;
int mask,val;
int hashbase = LSH_HASHBASE;
for (i = 0; i < L; ++i) {
curid = i; // Tack-on bits that indicate the particular table this bin id belongs to
bitsleft = k;
do {
if (bucketcnt == 0) {
hashbase = (hashbase * HASH_MULTIPLIER) + HASH_ADDEND;
bucket = hash16DotProduct(bucket, vec, hashbase);
bucketcnt += 16;
}
if (bucketcnt >= bitsleft) {
curid <<= bitsleft;
mask = 1;
mask = (mask << bitsleft) - 1;
val = bucket >>> (bucketcnt - bitsleft);
curid |= (val & mask);
bucketcnt -= bitsleft;
bitsleft = 0;
} else {
curid <<= bucketcnt;
mask = 1;
mask = (mask << bucketcnt) - 1;
curid |= (bucket & mask);
bitsleft -= bucketcnt;
bucketcnt = 0;
}
} while (bitsleft > 0);
char[] token = tokenList[i].buffer;
for(int j=0;j<token.length;++j) {
token[j] = Base64Lite.encode[curid & 0x3f]; // encode 6 bits
curid >>= 6; // move to next 6 bits
}
}
}
}
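setKandL sizes each bin-id token to hold the bits identifying one of the L tables plus the k hash bits, packed six bits per Base64 character. The same arithmetic in isolation (the class name and the example k and L values are illustrative):

```java
public class TokenSize {
    /** Number of Base64 characters needed for one bin id, per setKandL. */
    public static int numChars(int k, int L) {
        int numBits = 1;
        while ((1 << numBits) <= L) {   // bits needed to distinguish the L tables
            numBits += 1;
        }
        numBits += k;                   // plus the k hash bits of the bin id
        int numChar = numBits / 6;      // six bits per Base64 character
        if ((numBits % 6) != 0) {
            numChar += 1;               // round up to a whole character
        }
        return numChar;
    }

    public static void main(String[] args) {
        System.out.println(numChars(10, 60)); // 10 hash bits + 6 table bits -> 3 chars
    }
}
```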
@@ -0,0 +1,68 @@
/* ###
* IP: GHIDRA
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.plugin.analysis.lsh;
import java.io.IOException;
import org.apache.lucene.analysis.Tokenizer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import org.elasticsearch.plugin.analysis.lsh.LSHBinner.BytesRef;
import generic.lsh.vector.LSHVector;
import ghidra.features.bsim.query.elastic.Base64VectorFactory;
public class LSHTokenizer extends Tokenizer {
private final CharTermAttribute bytesAtt = addAttribute(CharTermAttribute.class);
private BytesRef[] tokens;
private int pos; // Number of terms/tokens returned so far
private Base64VectorFactory vectorFactory;
private LSHBinner binner;
private char[] vecBuffer;
public LSHTokenizer(int k,int L,Base64VectorFactory vFactory) {
super(DEFAULT_TOKEN_ATTRIBUTE_FACTORY);
vectorFactory = vFactory;
binner = new LSHBinner();
binner.setKandL(k, L);
pos = -1;
vecBuffer = Base64VectorFactory.allocateBuffer();
}
@Override
public boolean incrementToken() throws IOException {
clearAttributes();
if (pos < 0) {
LSHVector vector = vectorFactory.restoreVectorFromBase64(input,vecBuffer);
// AnalysisLSHPlugin.settingString = AnalysisLSHPlugin.settingString + " : " + Long.toHexString(vector.calcUniqueHash());
binner.generateBinIds(vector.getEntries());
tokens = binner.getTokenList();
pos = 0;
}
if (pos < tokens.length) {
char[] buffer = tokens[pos].buffer;
bytesAtt.copyBuffer(buffer,0,buffer.length);
pos += 1;
return true;
}
return false;
}
@Override
public void reset() throws IOException {
super.reset();
pos = -1;
}
}
@@ -0,0 +1,44 @@
/* ###
* IP: GHIDRA
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.plugin.analysis.lsh;
import org.apache.lucene.analysis.Tokenizer;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.env.Environment;
import org.elasticsearch.index.IndexSettings;
import org.elasticsearch.index.analysis.AbstractTokenizerFactory;
import ghidra.features.bsim.query.elastic.Base64VectorFactory;
import ghidra.features.bsim.query.elastic.ElasticUtilities;
public class LSHTokenizerFactory extends AbstractTokenizerFactory {
private Base64VectorFactory vectorFactory;
private int k;
private int L;
public LSHTokenizerFactory(IndexSettings indexSettings, Environment environment, String name, Settings settings) {
super(indexSettings, settings, name);
k = settings.getAsInt(ElasticUtilities.K_SETTING, -1);
L = settings.getAsInt(ElasticUtilities.L_SETTING, -1);
vectorFactory = AnalysisLSHPlugin.getVectorFactory(name);
}
@Override
public Tokenizer create() {
return new LSHTokenizer(k,L,vectorFactory);
}
}
@@ -0,0 +1,147 @@
/* ###
* IP: GHIDRA
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.plugin.analysis.lsh;
import java.io.*;
import java.util.Map;
import org.apache.lucene.document.Document;
import org.apache.lucene.util.BytesRef;
import org.elasticsearch.script.*;
import org.elasticsearch.script.ScoreScript.LeafFactory;
import org.elasticsearch.search.lookup.SearchLookup;
import generic.lsh.vector.LSHVector;
import generic.lsh.vector.VectorCompare;
import ghidra.features.bsim.query.elastic.Base64VectorFactory;
public class VectorCompareScriptFactory implements ScoreScript.Factory {
public final static String SCRIPT_NAME = "lsh_compare";
public final static String FEATURES_NAME = "{\"features\":\"";
@Override
public boolean isResultDeterministic() {
return true;
}
@Override
public LeafFactory newFactory(Map<String, Object> params, SearchLookup lookup) {
return new VectorCompareLeafFactory(params, lookup);
}
private static class VectorCompareLeafFactory implements LeafFactory {
private final Map<String, Object> params;
private final SearchLookup lookup;
private LSHVector baseVector; // Vector being compared to everything
private final double simthresh; // Similarity threshold
private final double sigthresh; // Significance threshold
private final Base64VectorFactory vectorFactory; // Factory used for this particular query
private VectorCompareLeafFactory(Map<String, Object> params, SearchLookup lookup) {
this.params = params;
this.lookup = lookup;
vectorFactory = AnalysisLSHPlugin.getVectorFactory((String) params.get("indexname"));
simthresh = (Double) params.get("simthresh");
sigthresh = (Double) params.get("sigthresh");
StringReader reader = new StringReader((String) params.get("vector"));
try {
baseVector = vectorFactory.restoreVectorFromBase64(reader,
Base64VectorFactory.allocateBuffer());
}
catch (IOException e) {
baseVector = null;
}
}
@Override
public boolean needs_score() {
return false;
}
private static int scanForFeatures(byte[] buffer, int offset) throws IOException {
int i = 0;
while (i < FEATURES_NAME.length()) {
char curChar = FEATURES_NAME.charAt(i);
int val = buffer[offset];
if (val == curChar) {
i += 1;
offset += 1;
}
else if (val == ' ' || val == '\t') {
offset += 1;
}
else {
throw new IOException("Document is missing \"features\"");
}
}
return offset;
}
private static int scanForLength(BytesRef byteRef, int startOffset) throws IOException {
int finalLength = 0;
int maxLength = byteRef.length - (startOffset - byteRef.offset);
while (finalLength < maxLength) {
if (byteRef.bytes[finalLength + startOffset] == '\"') {
break;
}
finalLength += 1;
}
if (finalLength == maxLength) {
throw new IOException("Document does not contain complete \"features\"");
}
return finalLength;
}
@Override
public ScoreScript newInstance(DocReader docReader) throws IOException {
return new ScoreScript(params, lookup, docReader) {
@Override
public double execute(ExplanationHolder explanation) {
try {
DocValuesDocReader dvReader = (DocValuesDocReader) docReader;
Document document =
dvReader.getLeafReaderContext().reader().document(_getDocId());
BytesRef byteRef = document.getField("_source").binaryValue();
int valOffset = scanForFeatures(byteRef.bytes, byteRef.offset);
int finalLength = scanForLength(byteRef, valOffset);
InputStream inputStream =
new ByteArrayInputStream(byteRef.bytes, valOffset, finalLength);
Reader reader = new InputStreamReader(inputStream);
// Ideally the VectorCompare would be shared between calls,
// but this routine must be thread safe, so we allocate one per call
VectorCompare vectorCompare = new VectorCompare();
LSHVector curVec = vectorFactory.restoreVectorFromBase64(reader,
Base64VectorFactory.allocateBuffer());
double sim = baseVector.compare(curVec, vectorCompare);
if (sim <= simthresh) {
return 0.0;
}
double sig = vectorFactory.calculateSignificance(vectorCompare);
if (sig <= sigthresh) {
return 0.0;
}
return sim;
}
catch (IOException e) {
return 0.0;
}
}
};
}
}
}
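The two private scan helpers above locate the base64 "features" payload directly inside the raw `_source` bytes of a document, avoiding a full JSON parse. A minimal standalone sketch (re-implementing the same logic outside the Elasticsearch stubs; the class name `ScanSketch` and the sample document are illustrative, not part of the plugin) shows the offsets they produce:

```java
import java.io.IOException;

public class ScanSketch {
	static final String FEATURES_NAME = "{\"features\":\"";

	// Same logic as scanForFeatures above: match the literal prefix,
	// skipping interleaved spaces/tabs, and return the offset just past it.
	static int scanForFeatures(byte[] buffer, int offset) throws IOException {
		int i = 0;
		while (i < FEATURES_NAME.length()) {
			char curChar = FEATURES_NAME.charAt(i);
			int val = buffer[offset];
			if (val == curChar) {
				i += 1;
				offset += 1;
			}
			else if (val == ' ' || val == '\t') {
				offset += 1;
			}
			else {
				throw new IOException("Document is missing \"features\"");
			}
		}
		return offset;
	}

	// Same logic as scanForLength above: count bytes up to the closing quote,
	// failing if the buffer ends before the quote is found.
	static int scanForLength(byte[] bytes, int startOffset) throws IOException {
		int finalLength = 0;
		int maxLength = bytes.length - startOffset;
		while (finalLength < maxLength) {
			if (bytes[finalLength + startOffset] == '\"') {
				break;
			}
			finalLength += 1;
		}
		if (finalLength == maxLength) {
			throw new IOException("Document does not contain complete \"features\"");
		}
		return finalLength;
	}

	public static void main(String[] args) throws IOException {
		byte[] src = "{ \"features\":\"AAECAw==\"}".getBytes();
		int valOffset = scanForFeatures(src, 0);
		int len = scanForLength(src, valOffset);
		// The payload is exactly the base64 text between the quotes
		System.out.println(new String(src, valOffset, len));
	}
}
```

This slicing is what lets `execute()` wrap the `_source` bytes in a `ByteArrayInputStream(bytes, valOffset, finalLength)` and hand them straight to the base64 vector decoder.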
@@ -0,0 +1,29 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for lucene class
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.lucene.analysis;
import java.io.Closeable;
import java.io.IOException;
import org.apache.lucene.util.AttributeFactory;
import org.apache.lucene.util.AttributeSource;
public abstract class TokenStream extends AttributeSource implements Closeable {
public static final AttributeFactory DEFAULT_TOKEN_ATTRIBUTE_FACTORY = null;
public abstract boolean incrementToken() throws IOException;
}
@@ -0,0 +1,38 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for lucene class
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.lucene.analysis;
import java.io.IOException;
import java.io.Reader;
import org.apache.lucene.util.AttributeFactory;
public abstract class Tokenizer extends TokenStream {
protected Reader input;
protected Tokenizer(AttributeFactory factory) {
}
@Override
public void close() throws IOException {
}
public void reset() throws IOException {
}
}
@@ -0,0 +1,25 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for lucene interface
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.lucene.analysis.tokenattributes;
import org.apache.lucene.util.Attribute;
public interface CharTermAttribute extends Attribute, CharSequence, Appendable {
public void copyBuffer(char[] buffer, int offset, int length);
}
@@ -0,0 +1,26 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for lucene class
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.lucene.document;
import org.apache.lucene.index.IndexableField;
public class Document {
public final IndexableField getField(String name) {
return null;
}
}
@@ -0,0 +1,27 @@
/* ###
* IP: GHIDRA
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.lucene.index;
import java.io.Closeable;
import java.io.IOException;
import org.apache.lucene.document.Document;
public abstract class IndexReader implements Closeable {
public final Document document(int docID) throws IOException {
return null;
}
}
@@ -0,0 +1,21 @@
/* ###
* IP: GHIDRA
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.lucene.index;
public abstract class IndexReaderContext {
public abstract IndexReader reader();
}
@@ -0,0 +1,23 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for lucene interface
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.lucene.index;
import org.apache.lucene.util.BytesRef;
public interface IndexableField {
public BytesRef binaryValue();
}
@@ -0,0 +1,21 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for lucene class
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.lucene.index;
public abstract class LeafReader extends IndexReader {
}
@@ -0,0 +1,24 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for lucene class
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.lucene.index;
public final class LeafReaderContext extends IndexReaderContext {
@Override
public LeafReader reader() {
return null;
}
}
@@ -0,0 +1,21 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for lucene interface
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.lucene.util;
public interface Attribute {
}
@@ -0,0 +1,20 @@
/* ###
* IP: GHIDRA
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.lucene.util;
public abstract class AttributeFactory {
}
@@ -0,0 +1,27 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for lucene class
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.lucene.util;
public class AttributeSource {
public final <T extends Attribute> T addAttribute(Class<T> attClass) {
return null;
}
public final void clearAttributes() {
}
}
@@ -0,0 +1,23 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for lucene class
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.lucene.util;
public class BytesRef {
public byte[] bytes;
public int length;
public int offset;
}
@@ -0,0 +1,34 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for elasticsearch class
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.common.settings;
import java.util.Set;
public class Settings {
public Integer getAsInt(String setting, Integer defaultValue) {
return null;
}
public String get(String setting) {
return null;
}
public Set<String> keySet() {
return null;
}
}
@@ -0,0 +1,21 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for elasticsearch class
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.env;
public class Environment {
}
@@ -0,0 +1,26 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for elasticsearch class
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.index;
import org.elasticsearch.common.settings.Settings;
public class IndexModule {
public Settings getSettings() {
return null;
}
}
@@ -0,0 +1,21 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for elasticsearch class
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.index;
public final class IndexSettings {
}
@@ -0,0 +1,27 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for elasticsearch class
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.index.analysis;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.index.IndexSettings;
public abstract class AbstractTokenizerFactory implements TokenizerFactory {
public AbstractTokenizerFactory(IndexSettings indexSettings, Settings settings, String name) {
}
}
@@ -0,0 +1,24 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for elasticsearch interface
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.index.analysis;
import org.apache.lucene.analysis.Tokenizer;
public interface TokenizerFactory {
Tokenizer create();
}
@@ -0,0 +1,31 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for elasticsearch class
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.indices.analysis;
import java.io.IOException;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.env.Environment;
import org.elasticsearch.index.IndexSettings;
public class AnalysisModule {
public interface AnalysisProvider<T> {
T get(IndexSettings indexSettings, Environment environment, String name, Settings settings)
throws IOException;
}
}
@@ -0,0 +1,27 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for elasticsearch interface
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.plugins;
import java.util.Map;
import org.elasticsearch.index.analysis.TokenizerFactory;
import org.elasticsearch.indices.analysis.AnalysisModule.AnalysisProvider;
public interface AnalysisPlugin {
Map<String, AnalysisProvider<TokenizerFactory>> getTokenizers();
}
@@ -0,0 +1,32 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for elasticsearch class
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.plugins;
import java.io.Closeable;
import java.io.IOException;
import org.elasticsearch.index.IndexModule;
public abstract class Plugin implements Closeable {
public void onIndexModule(IndexModule indexModule) {
}
@Override
public void close() throws IOException {
}
}
@@ -0,0 +1,28 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for elasticsearch interface
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.plugins;
import java.util.Collection;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.script.ScriptContext;
import org.elasticsearch.script.ScriptEngine;
public interface ScriptPlugin {
ScriptEngine getScriptEngine(Settings settings, Collection<ScriptContext<?>> contexts);
}
@@ -0,0 +1,21 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for elasticsearch interface
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.script;
public interface DocReader {
}
@@ -0,0 +1,28 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for elasticsearch class
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.script;
import org.apache.lucene.index.LeafReaderContext;
public class DocValuesDocReader implements DocReader, LeafReaderContextSupplier {
@Override
public LeafReaderContext getLeafReaderContext() {
return null;
}
}
@@ -0,0 +1,23 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for elasticsearch interface
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.script;
import org.apache.lucene.index.LeafReaderContext;
public interface LeafReaderContextSupplier {
LeafReaderContext getLeafReaderContext();
}
@@ -0,0 +1,50 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for elasticsearch class
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.script;
import java.io.IOException;
import java.util.Map;
import org.elasticsearch.search.lookup.SearchLookup;
public abstract class ScoreScript {
public ScoreScript(Map<String, Object> params, SearchLookup searchLookup, DocReader docReader) {
}
public static class ExplanationHolder {
}
public static final ScriptContext<ScoreScript.Factory> CONTEXT = null;
public interface Factory extends ScriptFactory {
LeafFactory newFactory(Map<String, Object> params, SearchLookup lookup);
}
public interface LeafFactory {
boolean needs_score();
ScoreScript newInstance(DocReader reader) throws IOException;
}
public int _getDocId() {
return 0;
}
public abstract double execute(ExplanationHolder explanation);
}
@@ -0,0 +1,22 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for elasticsearch class
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.script;
public final class ScriptContext<T> {
public final String name = null;
public final Class<T> factoryClazz = null;
}
@@ -0,0 +1,30 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for elasticsearch interface
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.script;
import java.io.Closeable;
import java.util.Map;
import java.util.Set;
public interface ScriptEngine extends Closeable {
String getType();
<FactoryType> FactoryType compile(String name, String code, ScriptContext<FactoryType> context,
Map<String, String> params);
Set<ScriptContext<?>> getSupportedContexts();
}
@@ -0,0 +1,22 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for elasticsearch class
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.script;
public interface ScriptFactory {
boolean isResultDeterministic();
}
@@ -0,0 +1,21 @@
/* ###
* IP: GHIDRA
* NOTE: Dummy placeholder for elasticsearch class
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.elasticsearch.search.lookup;
public class SearchLookup {
}
@@ -0,0 +1,9 @@
##MODULE IP: Oxygen Icons - LGPL 3.0
MODULE FILE LICENSE: postgresql-15.3.tar.gz Postgresql License
MODULE FILE LICENSE: lib/postgresql-42.6.0.jar PostgresqlJDBC License
MODULE FILE LICENSE: lib/json-simple-1.1.1.jar Apache License 2.0
MODULE FILE LICENSE: lib/commons-dbcp2-2.9.0.jar Apache License 2.0
MODULE FILE LICENSE: lib/commons-pool2-2.11.1.jar Apache License 2.0
MODULE FILE LICENSE: lib/commons-logging-1.2.jar Apache License 2.0
MODULE FILE LICENSE: lib/log4j-jcl-2.16.0.jar Apache License 2.0
MODULE FILE LICENSE: lib/h2-2.2.220.jar H2 Mozilla License 2.0
@@ -0,0 +1,197 @@
/* ###
* IP: GHIDRA
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
apply from: "$rootProject.projectDir/gradle/distributableGhidraModule.gradle"
apply from: "$rootProject.projectDir/gradle/javaProject.gradle"
apply from: "$rootProject.projectDir/gradle/javaTestProject.gradle"
apply from: "$rootProject.projectDir/gradle/nativeProject.gradle"
apply from: "$rootProject.projectDir/gradle/helpProject.gradle"
apply plugin: 'eclipse'
eclipse.project.name = 'Features BSim'
import java.nio.file.Files
import org.gradle.util.GUtil
// NOTE: fetchDependencies.gradle must be updated if postgresql version changes
def postgresql_distro = "postgresql-15.3.tar.gz"
dependencies {
api project(":Decompiler")
api project(":CodeCompare")
api "org.postgresql:postgresql:42.6.0"
api "org.json.simple:json-simple:1.1.1"
api "org.apache.commons:commons-dbcp2:2.9.0"
api "org.apache.commons:commons-pool2:2.11.1"
api "org.apache.commons:commons-logging:1.2"
api "org.apache.logging.log4j:log4j-jcl:2.16.0"
api "com.h2database:h2:2.2.220"
}
// Copy postgresql source distro, lshvector plugin source, and make-postgres.sh
// into common zip to allow for a rebuild of the postgres server if needed
rootProject.assembleDistribution {
String postgresqlDepsFile = "${DEPS_DIR}/BSim/${postgresql_distro}"
String postgresqlBinRepoFile = "${BIN_REPO}/Ghidra/Features/BSim/${postgresql_distro}"
def postgresqlFile = file(postgresqlDepsFile).exists() ? postgresqlDepsFile : postgresqlBinRepoFile
into (getZipPath(this.project)) {
from file("make-postgres.sh")
}
into (getZipPath(this.project)) {
from file(postgresqlFile)
}
into (getZipPath(this.project) + "/src/lshvector") {
from files("src/lshvector")
}
}
// Relative to the 'workingDir' Exec task property.
def installPoint = "../help/help"
/**
* Build the pdf docs for BSim and place into the '$installPoint' directory.
* A build (ex: 'gradle buildLocalTSSI_Release') will place the pdf in the distribution.
* There is an associated, auto-generated clean task.
**/
task buildBSimHelpPdf(type: Exec) {
workingDir 'src/main/doc'
def buildDir = "../../../build/BSimDocumentationPdf"
// Gradle will provide a cleanBuildBSimDocumentationPdf task that will remove these
// declared outputs.
outputs.dir "$workingDir/$buildDir"
outputs.file "$workingDir/$buildDir/bsim.pdf"
// 'which' returns the number of failed arguments
// Using the 'which' command first will allow the task to fail if the required
// executables are not installed.
//
// The bash commands end with "2>&1" to redirect stderr to stdout and have all
// messages print in sequence
//
// 'commandLine' takes one command, so wrap multiple commands in bash.
commandLine 'bash', '-e', '-c', """
echo '** Checking if required executables are installed. **'
which xsltproc
which fop
echo '** Preparing for xsltproc **'
mkdir -p $buildDir/images
cp $installPoint/topics/BSimDatabasePlugin/images/*.png $buildDir/images
echo '** Building bsim.fo **'
xsltproc --output $buildDir/bsim_withscaling.xml --stringparam profile.condition "withscaling" commonprofile.xsl bsim.xml 2>&1
xsltproc --output $buildDir/bsim.fo focustom.xsl $buildDir/bsim_withscaling.xml 2>&1
echo '** Building bsim.pdf **'
fop $buildDir/bsim.fo $buildDir/bsim.pdf 2>&1
echo '** Done. **'
"""
// Allows doLast block regardless of exit value.
ignoreExitValue true
// Store the output instead of printing to the console.
standardOutput = new ByteArrayOutputStream()
ext.output = { standardOutput.toString() }
ext.errorOutput = { standardOutput.toString() }
// Check the OS before executing command.
doFirst {
if (!getCurrentPlatformName().startsWith("linux")) {
throw new TaskExecutionException( it, new Exception("The '$it.name' task only works on Linux."))
}
}
// Print the output of the commands and check the return value.
doLast {
println output()
if (execResult.exitValue) {
logger.error("$it.name: An error occurred. Here is the output:\n" + output())
throw new TaskExecutionException( it, new Exception("'$it.name': The command '${commandLine.join(' ')}'" +
    "\nfailed with exit code $execResult.exitValue; see the task output for details."))
}
}
}
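The fail-fast check used by both doc-build tasks relies on 'bash -e' plus 'which': under '-e' the script aborts at the first non-zero exit status, and 'which' exits non-zero when an argument is not on the PATH. A minimal standalone sketch of that pattern (the probed tool names here are illustrative, not the task's actual requirements):

```shell
#!/usr/bin/env bash
# Abort on the first failing command, as the Gradle tasks do with 'bash -e -c'.
set -e

echo '** Checking if required executables are installed. **'
# 'which' exits with the number of arguments it could not resolve, so a
# missing tool stops the script here, before any real work starts.
which bash
which sed

echo '** All required tools are present; safe to run the build steps. **'
```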
/**
 * Build the HTML docs for BSim and place them into the '$installPoint' directory.
 * A build (e.g., 'gradle buildLocalTSSI_Release') will place the HTML files in the distribution.
 **/
task buildBSimHelpHtml(type: Exec) {
workingDir 'src/main/doc'
def buildDir = "../../../build/html"
// 'which' exits with the number of arguments it could not resolve, so
// running it first (under 'bash -e') makes the task fail early if the
// required executables are not installed.
//
// The bash commands end with "2>&1" to redirect stderr to stdout and have all
// messages print in sequence
//
// 'commandLine' takes one command, so wrap multiple commands in bash.
commandLine 'bash', '-e', '-c', """
echo '** Checking if required executables are installed. **'
which xsltproc
which sed
echo '** Removing older html files installed under '$installPoint' **'
rm -f $installPoint/topics/BSimDatabasePlugin/*.html
echo '** Building html files **'
xsltproc --output $buildDir/bsim_noscaling.xml --stringparam profile.condition "noscaling" commonprofile.xsl bsim.xml 2>&1
xsltproc --stringparam base.dir ${installPoint}/topics/BSimDatabasePlugin/ htmlcustom.xsl $buildDir/bsim_noscaling.xml 2>&1
sed -i -e '/DefaultStyle.css/ { p; sQhref=".*"Qhref="../../shared/languages.css"Q; }' ${installPoint}/topics/BSimDatabasePlugin/*.html
rm $installPoint/topics/BSimDatabasePlugin/index.html
echo '** Done. **'
"""
// Allow the doLast block to run regardless of the exit value.
ignoreExitValue true
// Store the output instead of printing to the console.
standardOutput = new ByteArrayOutputStream()
ext.output = { standardOutput.toString() }
ext.errorOutput = { standardOutput.toString() }
// Check the OS before executing command.
doFirst {
if (!getCurrentPlatformName().startsWith("linux")) {
throw new TaskExecutionException( it, new Exception("The '$it.name' task only works on Linux."))
}
}
// Print the output of the commands and check the return value.
doLast {
println output()
if (execResult.exitValue) {
logger.error("$it.name: An error occurred. Here is the output:\n" + output())
throw new TaskExecutionException( it, new Exception("'$it.name': The command '${commandLine.join(' ')}'" +
    "\nfailed with exit code $execResult.exitValue; see the task output for details."))
}
}
}
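The in-place 'sed' edit in the HTML task duplicates each generated line that references 'DefaultStyle.css' (the 'p' command prints the original), then rewrites the 'href' in the auto-printed copy so every page also pulls in the shared 'languages.css'. A sketch of the same substitution, using '|' as the delimiter instead of 'Q' and an illustrative input line:

```shell
# Duplicate the matching <link> line, then rewrite the href in the copy.
printf '<link rel="stylesheet" href="help/shared/DefaultStyle.css">\n' |
  sed -e '/DefaultStyle.css/ { p; s|href=".*"|href="../../shared/languages.css"|; }'
```

The result is two lines: the untouched DefaultStyle.css link followed by a copy whose href points at ../../shared/languages.css.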
@@ -0,0 +1,51 @@
##VERSION: 2.0
##MODULE IP: Apache License 2.0
##MODULE IP: Creative Commons Attribution 2.5
##MODULE IP: Crystal Clear Icons - LGPL 2.1
##MODULE IP: FAMFAMFAM Icons - CC 2.5
##MODULE IP: H2 Mozilla License 2.0
##MODULE IP: LGPL 2.1
##MODULE IP: LGPL 3.0
##MODULE IP: Oxygen Icons - LGPL 3.0
##MODULE IP: Postgresql License
##MODULE IP: PostgresqlJDBC License
##MODULE IP: Public Domain
Module.manifest||GHIDRA||||END|
data/bsim.theme.properties||GHIDRA||||END|
data/large_32.xml||GHIDRA||||END|
data/lshweights_32.xml||GHIDRA|||Signature data|END|
data/lshweights_64.xml||GHIDRA|||Signature data|END|
data/lshweights_64_32.xml||GHIDRA|||Signature data|END|
data/lshweights_cpool.xml||GHIDRA||||END|
data/lshweights_nosize.xml||GHIDRA||||END|
data/medium_32.xml||GHIDRA||||END|
data/medium_64.xml||GHIDRA||||END|
data/medium_cpool.xml||GHIDRA||||END|
data/medium_nosize.xml||GHIDRA||||END|
data/serverconfig.xml||GHIDRA||||END|
src/lshvector/Makefile.lshvector||GHIDRA||||END|
src/lshvector/lshvector--1.0.sql||GHIDRA||||END|
src/lshvector/lshvector.control||GHIDRA||||END|
src/main/help/help/TOC_Source.xml||GHIDRA||||END|
src/main/help/help/topics/BSim/BSimOverview.html||GHIDRA||||END|
src/main/help/help/topics/BSim/CommandLineReference.html||GHIDRA||||END|
src/main/help/help/topics/BSim/DatabaseConfiguration.html||GHIDRA||||END|
src/main/help/help/topics/BSim/FeatureWeight.html||GHIDRA||||END|
src/main/help/help/topics/BSim/IngestProcess.html||GHIDRA||||END|
src/main/help/help/topics/BSimSearchPlugin/BSimSearch.html||GHIDRA||||END|
src/main/help/help/topics/BSimSearchPlugin/images/AddServerDialog.png||GHIDRA||||END|
src/main/help/help/topics/BSimSearchPlugin/images/ApplyResultsPanel.png||GHIDRA||||END|
src/main/help/help/topics/BSimSearchPlugin/images/BSimOverviewDialog.png||GHIDRA||||END|
src/main/help/help/topics/BSimSearchPlugin/images/BSimOverviewResults.png||GHIDRA||||END|
src/main/help/help/topics/BSimSearchPlugin/images/BSimResultsProvider.png||GHIDRA||||END|
src/main/help/help/topics/BSimSearchPlugin/images/BSimSearchDialog.png||GHIDRA||||END|
src/main/help/help/topics/BSimSearchPlugin/images/ManageServersDialog.png||GHIDRA||||END|
src/main/resources/bsim.log4j.xml||GHIDRA||||END|
src/main/resources/images/checkmark_yellow.gif||GHIDRA||||END|
src/main/resources/images/flag_green.png||FAMFAMFAM Icons - CC 2.5|||famfamfam silk icon set|END|
src/main/resources/images/preferences-desktop-user-password.png||Oxygen Icons - LGPL 3.0|||Oxygen icon theme (dual license; LGPL or CC-SA-3.0)|END|
src/main/resources/images/preferences-web-browser-shortcuts-32.png||Oxygen Icons - LGPL 3.0|||Oxygen icon theme (dual license; LGPL or CC-SA-3.0)|END|
src/main/resources/images/preferences-web-browser-shortcuts.png||LGPL 3.0|||oxygen|END|
src/main/resources/images/view_top_bottom.png||Crystal Clear Icons - LGPL 2.1||||END|
src/main/resources/log4j-appender-console.xml||GHIDRA||||END|
src/main/resources/log4j-appender-rolling-file.xml||GHIDRA||||END|
@@ -0,0 +1,17 @@
[Defaults]
icon.bsim.query.dialog.provider = preferences-web-browser-shortcuts.png
icon.bsim.change.password = preferences-desktop-user-password.png
icon.bsim.table.split = view_top_bottom.png
icon.bsim.results.status.name.applied = checkmark_green.gif
icon.bsim.results.status.signature.applied = EMPTY_ICON {checkmark_green.gif[move(-2,-1)]} {checkmark_green.gif [move(4,0)]}
icon.bsim.results.status.matches = flag_green.png
icon.bsim.results.status.ignored = checkmark_yellow.gif
icon.bsim.functions.table = FunctionScope.gif
[Dark Defaults]
Some files were not shown because too many files have changed in this diff.