Upgrade JDK toolchain/target version to 17 and add GraalJS #1207

Draft · wants to merge 20 commits into base: main

Commits (20)
b00c63b
Upgrade JDK toolchain and target version for most packages to 17 to g…
gregschohn Dec 20, 2024
56b9bd0
Add a pure JavascriptTransformer that can run initialization and per-…
gregschohn Dec 22, 2024
4816155
Merge branch 'main' into liquidJsTransforms
AndreKurait Jan 8, 2025
7b4cecd
Merge remote-tracking branch 'upstream/main' into liquidJsTransforms
AndreKurait Jan 9, 2025
ed2a0a8
Update javascript transformations to handle Object IJsonTransformer c…
AndreKurait Jan 9, 2025
7e12bac
Remove LiquidJs due to performance slowdown over Javascript
AndreKurait Jan 9, 2025
1fe744a
Create JsonJSTransformerProvider for custom scripts
AndreKurait Jan 9, 2025
efc4df2
Update TypeMappingSanitizationTransformerProvider to share logic with…
AndreKurait Jan 9, 2025
397d72e
Revert vscode settings
AndreKurait Jan 9, 2025
738cf2b
Update project (except capture proxy and dependencies) to Java 17 for…
AndreKurait Jan 10, 2025
4091797
Enable error prone for JDK 17
AndreKurait Jan 10, 2025
ae06216
re-enable task tree
AndreKurait Jan 10, 2025
4515ed3
Update readme and transformation type mapping parameter input
AndreKurait Jan 10, 2025
bc6e289
Fix javascript typemapping sanitization
AndreKurait Jan 10, 2025
a1c3a6a
Restrict polyglot host access
AndreKurait Jan 11, 2025
7be2a63
Refactor javascript transformers with graaljs community, security set…
AndreKurait Jan 14, 2025
a3fd9d7
Update Java Version docs
AndreKurait Jan 14, 2025
313d768
Add comment on task tree
AndreKurait Jan 14, 2025
9a48175
Retain compatibility for spotless with jdk 11
AndreKurait Jan 14, 2025
5b6fc2a
Support arbitrary static bindings object
AndreKurait Jan 14, 2025
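
The JavaScript transformer commits above (7e12bac, 1fe744a, a1c3a6a, 7be2a63) replace LiquidJs with GraalJS and deliberately restrict polyglot host access. The transformer classes themselves are truncated out of this diff view, so the following is only a minimal sketch of a locked-down GraalJS context; the builder options shown are an assumption about the approach, not code taken from this PR.

import org.graalvm.polyglot.Context;
import org.graalvm.polyglot.HostAccess;
import org.graalvm.polyglot.Value;

// Hypothetical illustration: evaluate a user-supplied transform with no
// host (Java) objects or classes reachable from the script.
public class RestrictedJsSketch {
    public static void main(String[] args) {
        try (Context context = Context.newBuilder("js")
                .allowAllAccess(false)                     // deny everything by default
                .allowHostAccess(HostAccess.NONE)          // no Java members exposed
                .allowHostClassLookup(className -> false)  // no host class lookups
                .build()) {
            Value transform = context.eval("js", "(doc) => { doc.visited = true; return doc; }");
            Value result = transform.execute(context.eval("js", "({id: 1})"));
            System.out.println(result.getMember("visited").asBoolean()); // true
        }
    }
}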
2 changes: 1 addition & 1 deletion .github/workflows/CI.yml
@@ -6,7 +6,7 @@ on:

env:
python-version: '3.11'
java-version: '11'
java-version: '17'
gradle-version: '8.0.2'
node-version: '18.x'

2 changes: 1 addition & 1 deletion .github/workflows/release-drafter.yml
@@ -6,7 +6,7 @@ on:
tags:
- "*"
env:
java-version: '11'
java-version: '17'
gradle-version: '8.0.2'

permissions:
4 changes: 2 additions & 2 deletions .github/workflows/sonar-qube.yml
@@ -5,7 +5,7 @@ on:
pull_request:

env:
java-version: '11'
java-version: '17'
gradle-version: '8.0.2'

jobs:
@@ -32,7 +32,7 @@ jobs:
with:
distribution: 'corretto'
java-version: |
11
17
- name: Cache SonarQube Scanner
uses: actions/cache@v3
with:
3 changes: 0 additions & 3 deletions CreateSnapshot/build.gradle
@@ -5,9 +5,6 @@ plugins {
id 'org.opensearch.migrations.java-application-conventions'
}

java.sourceCompatibility = JavaVersion.VERSION_11
java.targetCompatibility = JavaVersion.VERSION_11

dependencies {
implementation project(':coreUtilities')
implementation project(":RFS")
2 changes: 1 addition & 1 deletion DEVELOPER_GUIDE.md
@@ -11,7 +11,7 @@

## Prerequisites

- Java Development Kit (JDK) 11
- Java Development Kit (JDK) 11-17
- Gradle 8
- Python3
- Docker and Docker Compose (for local deployment)
3 changes: 0 additions & 3 deletions DataGenerator/build.gradle
@@ -4,9 +4,6 @@ plugins {
id 'io.freefair.lombok'
}

java.sourceCompatibility = JavaVersion.VERSION_11
java.targetCompatibility = JavaVersion.VERSION_11

dependencies {
implementation project(":coreUtilities")
implementation project(":RFS")
3 changes: 0 additions & 3 deletions DocumentsFromSnapshotMigration/build.gradle
@@ -10,9 +10,6 @@ import com.bmuschko.gradle.docker.tasks.image.DockerBuildImage
import groovy.transform.Canonical
import org.opensearch.migrations.common.CommonUtils

java.sourceCompatibility = JavaVersion.VERSION_11
java.targetCompatibility = JavaVersion.VERSION_11

@Canonical
class DockerServiceProps {
String projectName = ""
2 changes: 1 addition & 1 deletion DocumentsFromSnapshotMigration/docker/Dockerfile
@@ -1,5 +1,5 @@
# Using same base image as other Java containers in this repo
FROM amazoncorretto:11-al2023-headless
FROM amazoncorretto:17-al2023-headless

# Install the AWS CLI in the container
RUN dnf install -y aws-cli --setopt=install_weak_deps=False && \
@@ -6,7 +6,9 @@
import java.time.Clock;
import java.time.Duration;
import java.util.List;
import java.util.Optional;
import java.util.function.Function;
import java.util.function.Supplier;

import org.opensearch.migrations.bulkload.common.DefaultSourceRepoAccessor;
import org.opensearch.migrations.bulkload.common.DocumentReindexer;
@@ -259,16 +261,11 @@ public static void main(String[] args) throws Exception {
var connectionContext = arguments.targetArgs.toConnectionContext();


String docTransformerConfig = TransformerConfigUtils.getTransformerConfig(arguments.docTransformationParams);
if (docTransformerConfig != null) {
log.atInfo().setMessage("Doc Transformations config string: {}")
.addArgument(docTransformerConfig).log();
} else {
log.atInfo().setMessage("Using default transformation config: {}")
.addArgument(DEFAULT_DOCUMENT_TRANSFORMATION_CONFIG).log();
docTransformerConfig = DEFAULT_DOCUMENT_TRANSFORMATION_CONFIG;
}
IJsonTransformer docTransformer = new TransformationLoader().getTransformerFactoryLoader(docTransformerConfig);
var docTransformerConfig = Optional.ofNullable(TransformerConfigUtils.getTransformerConfig(arguments.docTransformationParams))
.orElse(DEFAULT_DOCUMENT_TRANSFORMATION_CONFIG);
log.atInfo().setMessage("Doc Transformations config string: {}")
.addArgument(docTransformerConfig).log();
Supplier<IJsonTransformer> docTransformerSupplier = () -> new TransformationLoader().getTransformerFactoryLoader(docTransformerConfig);

try (var processManager = new LeaseExpireTrigger(RfsMigrateDocuments::exitOnLeaseTimeout, Clock.systemUTC());
var workCoordinator = new OpenSearchWorkCoordinator(
@@ -282,7 +279,7 @@ public static void main(String[] args) throws Exception {
arguments.numDocsPerBulkRequest,
arguments.numBytesPerBulkRequest,
arguments.maxConnections,
docTransformer);
docTransformerSupplier);

SourceRepo sourceRepo;
if (snapshotLocalDirPath == null) {
@@ -101,7 +101,7 @@ protected RfsLuceneDocument getDocument(IndexReader reader, int docId, boolean i
int maxDocsPerBulkRequest = 1000;
long maxBytesPerBulkRequest = Long.MAX_VALUE; // No Limit on Size
int maxConcurrentWorkItems = 10;
DocumentReindexer reindexer = new DocumentReindexer(mockClient, maxDocsPerBulkRequest, maxBytesPerBulkRequest, maxConcurrentWorkItems, null);
DocumentReindexer reindexer = new DocumentReindexer(mockClient, maxDocsPerBulkRequest, maxBytesPerBulkRequest, maxConcurrentWorkItems, () -> null);

// Create a mock IDocumentReindexContext
IDocumentMigrationContexts.IDocumentReindexContext mockContext = mock(IDocumentMigrationContexts.IDocumentReindexContext.class);
@@ -280,7 +280,7 @@ public static DocumentsRunner.CompletionStatus migrateDocumentsWithOneWorker(
.host(targetAddress)
.compressionEnabled(compressionEnabled)
.build()
.toConnectionContext()), 1000, Long.MAX_VALUE, 1, defaultDocTransformer),
.toConnectionContext()), 1000, Long.MAX_VALUE, 1, () -> defaultDocTransformer),
new OpenSearchWorkCoordinator(
new CoordinateWorkHttpClient(ConnectionContextTestParams.builder()
.host(targetAddress)
3 changes: 0 additions & 3 deletions MetadataMigration/build.gradle
@@ -4,9 +4,6 @@ plugins {
id 'io.freefair.lombok'
}

java.sourceCompatibility = JavaVersion.VERSION_11
java.targetCompatibility = JavaVersion.VERSION_11

dependencies {
implementation project(":coreUtilities")
implementation project(":RFS")
3 changes: 0 additions & 3 deletions RFS/build.gradle
@@ -6,9 +6,6 @@ plugins {
id 'me.champeau.jmh'
}

java.sourceCompatibility = JavaVersion.VERSION_11
java.targetCompatibility = JavaVersion.VERSION_11

ext {
awsSdkVersion = '2.25.16'
dataset = findProperty('dataset') ?: 'skip_dataset'
@@ -3,6 +3,7 @@
import java.util.List;
import java.util.UUID;
import java.util.function.Predicate;
import java.util.function.Supplier;

import org.opensearch.migrations.reindexer.tracing.IDocumentMigrationContexts.IDocumentReindexContext;
import org.opensearch.migrations.transform.IJsonTransformer;
@@ -23,13 +24,15 @@ public class DocumentReindexer {
private final int maxDocsPerBulkRequest;
private final long maxBytesPerBulkRequest;
private final int maxConcurrentWorkItems;
private final IJsonTransformer transformer;
private final Supplier<IJsonTransformer> transformerSupplier;

public Mono<Void> reindex(String indexName, Flux<RfsLuceneDocument> documentStream, IDocumentReindexContext context) {
// Transformers cannot be used simultaneously
var threadSafeTransformer = ThreadLocal.withInitial(transformerSupplier);
var scheduler = Schedulers.newParallel("DocumentBulkAggregator");
var bulkDocs = documentStream
.publishOn(scheduler, 1)
.map(doc -> transformDocument(doc,indexName));
.map(doc -> transformDocument(threadSafeTransformer, doc,indexName));

return this.reindexDocsInParallelBatches(bulkDocs, indexName, context)
.doOnSuccess(unused -> log.debug("All batches processed"))
@@ -53,7 +56,8 @@ Mono<Void> reindexDocsInParallelBatches(Flux<BulkDocSection> docs, String indexN
}

@SneakyThrows
BulkDocSection transformDocument(RfsLuceneDocument doc, String indexName) {
BulkDocSection transformDocument(ThreadLocal<IJsonTransformer> transformerLocal, RfsLuceneDocument doc, String indexName) {
var transformer = transformerLocal.get();
var original = new BulkDocSection(doc.id, indexName, doc.type, doc.source, doc.routing);
if (transformer != null) {
final Object transformedDoc = transformer.transformJson(original.toMap());
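
The DocumentReindexer hunks above replace the shared IJsonTransformer field with a Supplier<IJsonTransformer> that is resolved through ThreadLocal.withInitial, so each thread of the parallel scheduler lazily builds and reuses its own transformer instead of sharing one instance that is unsafe for concurrent use. A minimal self-contained sketch of the pattern follows; the Transformer interface in it is a stand-in for the project's IJsonTransformer, not the real type.

import java.util.function.Supplier;

// Stand-in types: illustrates only the ThreadLocal-per-thread pattern.
public class PerThreadTransformerSketch {
    interface Transformer { String transform(String json); }

    public static void main(String[] args) throws InterruptedException {
        // The supplier runs once per thread, on first use of the ThreadLocal.
        Supplier<Transformer> supplier = () -> {
            System.out.println("creating transformer on " + Thread.currentThread().getName());
            return json -> json.toUpperCase(); // trivial stand-in "transformation"
        };
        ThreadLocal<Transformer> perThread = ThreadLocal.withInitial(supplier);

        Runnable work = () -> System.out.println(perThread.get().transform("{\"id\":1}"));
        Thread a = new Thread(work, "worker-a");
        Thread b = new Thread(work, "worker-b");
        a.start(); b.start();
        a.join(); b.join(); // each worker prints its own "creating transformer" line
    }
}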
@@ -50,7 +50,7 @@ class DocumentReindexerTest {
@BeforeEach
void setUp() {
MockitoAnnotations.openMocks(this);
documentReindexer = new DocumentReindexer(mockClient, MAX_DOCS_PER_BULK, MAX_BYTES_PER_BULK_REQUEST, MAX_CONCURRENT_REQUESTS, null);
documentReindexer = new DocumentReindexer(mockClient, MAX_DOCS_PER_BULK, MAX_BYTES_PER_BULK_REQUEST, MAX_CONCURRENT_REQUESTS, () -> null);
when(mockContext.createBulkRequest()).thenReturn(mock(IRfsContexts.IRequestContext.class));
}

@@ -132,7 +132,7 @@ void reindex_shouldBufferByTransformedSize() throws JsonProcessingException {

// Initialize DocumentReindexer with the transformer
documentReindexer = new DocumentReindexer(
mockClient, numDocs, MAX_BYTES_PER_BULK_REQUEST, MAX_CONCURRENT_REQUESTS, transformer
mockClient, numDocs, MAX_BYTES_PER_BULK_REQUEST, MAX_CONCURRENT_REQUESTS, () -> transformer
);

Flux<RfsLuceneDocument> documentStream = Flux.range(1, numDocs)
@@ -237,7 +237,7 @@ private RfsLuceneDocument createLargeTestDocument(String id, int size) {
void reindex_shouldRespectMaxConcurrentRequests() {
int numDocs = 100;
int maxConcurrentRequests = 5;
DocumentReindexer concurrentReindexer = new DocumentReindexer(mockClient, 1, MAX_BYTES_PER_BULK_REQUEST, maxConcurrentRequests, null);
DocumentReindexer concurrentReindexer = new DocumentReindexer(mockClient, 1, MAX_BYTES_PER_BULK_REQUEST, maxConcurrentRequests, () -> null);

Flux<RfsLuceneDocument> documentStream = Flux.range(1, numDocs).map(i -> createTestDocument(String.valueOf(i)));

@@ -275,7 +275,7 @@ void reindex_shouldTransformDocuments() {
IJsonTransformer transformer = new TransformationLoader().getTransformerFactoryLoader(CONFIG);

// Initialize DocumentReindexer with the transformer
documentReindexer = new DocumentReindexer(mockClient, MAX_DOCS_PER_BULK, MAX_BYTES_PER_BULK_REQUEST, MAX_CONCURRENT_REQUESTS, transformer);
documentReindexer = new DocumentReindexer(mockClient, MAX_DOCS_PER_BULK, MAX_BYTES_PER_BULK_REQUEST, MAX_CONCURRENT_REQUESTS, () -> transformer);

// Create a stream of documents, some requiring transformation and some not
Flux<RfsLuceneDocument> documentStream = Flux.just(
@@ -82,7 +82,7 @@ public void updateTargetCluster(
).readDocuments();

final var finalShardId = shardId;
new DocumentReindexer(client, 100, Long.MAX_VALUE, 1, null).reindex(index.getName(), documents, context)
new DocumentReindexer(client, 100, Long.MAX_VALUE, 1, () -> null).reindex(index.getName(), documents, context)
.doOnError(error -> logger.error("Error during reindexing: " + error))
.doOnSuccess(
done -> logger.info(
23 changes: 16 additions & 7 deletions TrafficCapture/build.gradle
@@ -9,10 +9,25 @@ subprojects {
// TODO: Expand to do more static checking in more projects
if (project.name == "trafficReplayer" || project.name == "trafficCaptureProxyServer") {
dependencies {
annotationProcessor group: 'com.google.errorprone', name: 'error_prone_core', version: '2.26.1'
annotationProcessor group: 'com.google.errorprone', name: 'error_prone_core', version: '2.32.0'
}
tasks.named('compileJava', JavaCompile) {
if (project.name == "trafficReplayer" || project.name == "trafficCaptureProxyServer") {
options.fork = true
// Taken from https://errorprone.info/docs/installation
options.forkOptions.jvmArgs += [
'--add-exports', 'jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED',
'--add-exports', 'jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED',
'--add-exports', 'jdk.compiler/com.sun.tools.javac.main=ALL-UNNAMED',
'--add-exports', 'jdk.compiler/com.sun.tools.javac.model=ALL-UNNAMED',
'--add-exports', 'jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED',
'--add-exports', 'jdk.compiler/com.sun.tools.javac.processing=ALL-UNNAMED',
'--add-exports', 'jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED',
'--add-exports', 'jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED',
'--add-opens', 'jdk.compiler/com.sun.tools.javac.code=ALL-UNNAMED',
'--add-opens', 'jdk.compiler/com.sun.tools.javac.comp=ALL-UNNAMED',
]

options.compilerArgs += [
"-XDcompilePolicy=simple",
"-Xplugin:ErrorProne -XepDisableAllChecks -Xep:MustBeClosed:ERROR -XepDisableWarningsInGeneratedCode",
Expand All @@ -25,12 +40,6 @@ subprojects {
allprojects {
apply plugin: 'java'
apply plugin: 'org.owasp.dependencycheck'

java {
toolchain {
languageVersion = JavaLanguageVersion.of(11)
}
}
}

subprojects {
3 changes: 2 additions & 1 deletion TrafficCapture/dockerSolution/README.md
@@ -91,7 +91,8 @@ within the final yaml configuration that is being output.

## Compatibility

The tools in this directory can only be built if you have Java version 11 installed.
The tools in this directory can only be built if you have Java version 11-17
installed.

The version is specified in `TrafficCapture/build.gradle` using a Java toolchain, which allows us
to decouple the Java version used by Gradle itself from Java version used by the tools here.
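
For reference, the toolchain mechanism the README describes is the standard Gradle declaration sketched below. Where the repository declares it after this PR removes the per-subproject block (see the TrafficCapture/build.gradle hunk above), and whether it pins exactly 17, is not visible in this diff view.

// Standard Gradle Java toolchain declaration; 17 is shown as an assumption
// to match this PR's target version.
java {
    toolchain {
        languageVersion = JavaLanguageVersion.of(17)
    }
}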
91 changes: 91 additions & 0 deletions TrafficCapture/trafficCaptureProxyServer/build.gradle
@@ -65,3 +65,94 @@ application {
// Define the main class for the application.
mainClass = 'org.opensearch.migrations.trafficcapture.proxyserver.CaptureProxy'
}

tasks.register("verifyCompatibilityWithJDK11") {
group = "verification"
description = "Verify Java major versions of compiled classes and dependencies for compatibility with JDK 11."

// Declare task inputs
inputs.files(fileTree("${buildDir}/classes/java/main")).withPropertyName("compiledClasses")
inputs.files(configurations.compileClasspath).withPropertyName("compileClasspath")
inputs.files(configurations.runtimeClasspath).withPropertyName("runtimeClasspath")

// Declare outputs (optional if no files are written, just up-to-date checking)
outputs.upToDateWhen { true }

doLast {
// Task implementation logic
def isCompatible = true
def analyzedFiles = [] // To avoid duplicate processing

def getMajorVersion = { InputStream inputStream ->
inputStream.skip(6)
def majorVersion = inputStream.read() << 8 | inputStream.read()
inputStream.close()
return majorVersion
}

def processClassFile = { InputStream inputStream, String name, String source ->
def majorVersion = getMajorVersion(inputStream)
if (majorVersion > 55) {
isCompatible = false
println "Incompatible class file in ${source}: ${name} (Java major version ${majorVersion})"
}
}

def analyzeCompiledClasses = {
def outputDir = file("${buildDir}/classes/java/main")
if (outputDir.exists()) {
println "Analyzing compiled classes in ${outputDir}..."
fileTree(dir: outputDir, include: '**/*.class').forEach { classFile ->
classFile.withInputStream { inputStream ->
processClassFile(inputStream, classFile.name, outputDir.toString())
}
}
} else {
println "No compiled classes found in ${outputDir}."
}
}

def analyzeDependencies = { configuration ->
println "\nAnalyzing ${configuration.name} dependencies..."
configuration.files.each { dependency ->
if (dependency.name.endsWith(".jar") && !analyzedFiles.contains(dependency)) {
analyzedFiles.add(dependency)
def zipStream = new java.util.zip.ZipInputStream(new FileInputStream(dependency))
def entry
while ((entry = zipStream.nextEntry) != null) {
if (entry.name.endsWith(".class")) {
if (entry.name.startsWith("META-INF/versions/")) {
def versionDir = entry.name.split("/")[2]
def versionNumber = versionDir.isInteger() ? versionDir.toInteger() : 0
if (versionNumber > 11) {
continue
}
}
def tempFile = File.createTempFile("class", ".tmp")
tempFile.deleteOnExit()
tempFile.withOutputStream { it << zipStream }
tempFile.withInputStream { inputStream ->
processClassFile(inputStream, entry.name, dependency.name)
}
}
}
zipStream.close()
}
}
}

analyzeCompiledClasses()
analyzeDependencies(configurations.compileClasspath)
analyzeDependencies(configurations.runtimeClasspath)

if (!isCompatible) {
throw new GradleException("Incompatible class files detected! Ensure all classes are compatible with JDK 11 (major version ≤ 55).")
}
}
dependsOn tasks.named("assemble")
}

// Add class version verification to slowTest
tasks.named("slowTest") {
finalizedBy("verifyCompatibilityWithJDK11")
}
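
The verifyCompatibilityWithJDK11 task above relies on the class file header layout: a 4-byte 0xCAFEBABE magic number, a 2-byte minor version, then the 2-byte big-endian major version, which is why getMajorVersion skips 6 bytes and combines the next two. Major version 55 corresponds to Java 11 and 61 to Java 17 (Java N maps to major N + 44), so the task rejects anything above 55. A standalone Java sketch of the same check:

import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

// Reads the class file header: magic (4 bytes), minor (2), major (2).
// Major 52 = Java 8, 55 = Java 11, 61 = Java 17.
public class ClassVersionSketch {
    public static void main(String[] args) throws IOException {
        try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
            if (in.readInt() != 0xCAFEBABE) {
                throw new IOException("not a class file: " + args[0]);
            }
            int minor = in.readUnsignedShort();
            int major = in.readUnsignedShort();
            System.out.printf("major=%d minor=%d (Java %d)%n", major, minor, major - 44);
            if (major > 55) {
                System.out.println("incompatible with JDK 11");
            }
        }
    }
}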