
Difficulty getting started #3

Open
arnavm opened this issue Sep 29, 2023 · 0 comments
arnavm commented Sep 29, 2023

Hi,

I'm having some trouble getting RS-FISH-Spark working locally. One issue I've been able to troubleshoot; the other is harder to debug:

  1. Installation (on macOS 12.7)
> mvn clean package
[INFO] Scanning for projects...
[INFO]
[INFO] --------------------< net.preibisch:RS-FISH-Spark >---------------------
[INFO] Building RS-FISH Spark 0.0.1-SNAPSHOT
[INFO]   from pom.xml
[INFO] --------------------------------[ jar ]---------------------------------
[INFO]
[INFO] --- clean:3.1.0:clean (default-clean) @ RS-FISH-Spark ---
[INFO]
[INFO] --- enforcer:3.0.0-M3:enforce (enforce-rules) @ RS-FISH-Spark ---
[INFO] Adding ignore: module-info
[INFO] Adding ignore: META-INF/versions/*/module-info
[INFO] Adding ignore: com.esotericsoftware.kryo.*
[INFO] Adding ignore: com.esotericsoftware.minlog.*
[INFO] Adding ignore: com.esotericsoftware.reflectasm.*
[INFO] Adding ignore: com.google.inject.*
[INFO] Adding ignore: jnr.ffi.*
[INFO] Adding ignore: org.apache.hadoop.yarn.*.package-info
[INFO] Adding ignore: org.apache.spark.unused.UnusedStubClass
[INFO] Adding ignore: org.hibernate.stat.ConcurrentStatisticsImpl
[INFO] Adding ignore: org.jetbrains.kotlin.daemon.common.*
[INFO] Adding ignore: org.junit.runner.Runner
[INFO] Adding ignore: module-info
[INFO] Adding ignore: module-info
[INFO]
[INFO] --- build-helper:3.0.0:regex-property (sanitize-version) @ RS-FISH-Spark ---
[INFO]
[INFO] --- build-helper:3.0.0:regex-property (guess-package) @ RS-FISH-Spark ---
[INFO]
[INFO] --- buildnumber:1.4:create (default) @ RS-FISH-Spark ---
[INFO] Executing: /bin/sh -c cd '/Users/arnav/Downloads/RS-FISH-Spark-main' && 'git' 'rev-parse' '--verify' 'HEAD'
[INFO] Working directory: /Users/arnav/Downloads/RS-FISH-Spark-main
[INFO] Storing buildNumber: UNKNOWN at timestamp: 1696003952159
[WARNING] Cannot get the branch information from the git repository:
Detecting the current branch failed: fatal: not a git repository (or any of the parent directories): .git

[INFO] Executing: /bin/sh -c cd '/Users/arnav/Downloads/RS-FISH-Spark-main' && 'git' 'rev-parse' '--verify' 'HEAD'
[INFO] Working directory: /Users/arnav/Downloads/RS-FISH-Spark-main
[INFO] Storing buildScmBranch: UNKNOWN_BRANCH
[INFO]
[INFO] --- scijava:2.0.0:set-rootdir (set-rootdir) @ RS-FISH-Spark ---
[INFO] Setting rootdir: /Users/arnav/Downloads/RS-FISH-Spark-main
[INFO]
[INFO] --- jacoco:0.8.6:prepare-agent (jacoco-initialize) @ RS-FISH-Spark ---
[INFO] argLine set to -javaagent:/Users/arnav/.m2/repository/org/jacoco/org.jacoco.agent/0.8.6/org.jacoco.agent-0.8.6-runtime.jar=destfile=/Users/arnav/Downloads/RS-FISH-Spark-main/target/jacoco.exec
[INFO]
[INFO] --- resources:3.1.0:resources (default-resources) @ RS-FISH-Spark ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 98 resources
[INFO]
[INFO] --- compiler:3.8.1:compile (default-compile) @ RS-FISH-Spark ---
[INFO] Compiling 20 source files to /Users/arnav/Downloads/RS-FISH-Spark-main/target/classes
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  13.339 s
[INFO] Finished at: 2023-09-29T09:12:33-07:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.1:compile (default-compile) on project RS-FISH-Spark: Fatal error compiling: error: release version 1.8 not supported -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

I was able to fix this by un-commenting line 91 in pom.xml, after which the package builds successfully.
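For reference, the "release version 1.8 not supported" error usually means the JDK running Maven can no longer (or not yet) target Java 8, independent of RS-FISH itself. A minimal check of which JVM is actually in play (assuming only that the failure depends on the JVM version):

```java
// Print the version of the JVM that Maven would use. A high feature
// number here, combined with a pom targeting release 1.8, is the usual
// cause of "release version 1.8 not supported".
public class JavaVersionCheck {
    public static void main(String[] args) {
        System.out.println("java.version = " + System.getProperty("java.version"));
        // Runtime.version() is available on Java 9+
        System.out.println("feature      = " + Runtime.version().feature());
    }
}
```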

  2. Execution (using the supplied test.n5 image)
> java -cp target/RS-Fish-jar-with-dependencies.jar -Xmx12G "-Dspark.master=local[4]" net.preibisch.rsfish.spark.SparkRSFISH --image=src/main/resources/test.n5 --dataset=/N2-702-ch0/c0/s0 --minIntensity=0 --maxIntensity=4096 --anisotropy=0.7 --output=points.csv
--image=src/main/resources/test.n5 --dataset=/N2-702-ch0/c0/s0 --minIntensity=0 --maxIntensity=4096 --anisotropy=0.7 --output=points.csv
N5/HDF5/ZARR dataset dimensionality: 3
N5/HDF5/ZARR dataset size: (334, 454, 81)
Processing interval: [0, 0, 0] -> [333, 453, 80], dimensions (334, 454, 81)
Processing blocksize: (128, 128, 64)
09:18:24.887 [main] INFO org.apache.spark.SparkContext - Running Spark version 2.4.7
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
09:18:25.087 [main] DEBUG org.apache.hadoop.security.authentication.util.KerberosName - Kerberos krb5 configuration not found, setting default realm to empty
09:18:25.097 [main] DEBUG org.apache.hadoop.util.PerformanceAdvisory - Falling back to shell based
Exception in thread "main" java.lang.ExceptionInInitializerError
	at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
	at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:116)
	at org.apache.hadoop.security.Groups.<init>(Groups.java:93)
	at org.apache.hadoop.security.Groups.<init>(Groups.java:73)
	at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:293)
	at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:283)
	at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
	at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:789)
	at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:774)
	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:647)
	at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2422)
	at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2422)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2422)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:293)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
	at net.preibisch.rsfish.spark.SparkRSFISH.call(SparkRSFISH.java:211)
	at net.preibisch.rsfish.spark.SparkRSFISH.call(SparkRSFISH.java:32)
	at picocli.CommandLine.executeUserObject(CommandLine.java:1853)
	at picocli.CommandLine.access$1100(CommandLine.java:145)
	at picocli.CommandLine$RunLast.executeUserObjectOfLastSubcommandWithSameParent(CommandLine.java:2255)
	at picocli.CommandLine$RunLast.handle(CommandLine.java:2249)
	at picocli.CommandLine$RunLast.handle(CommandLine.java:2213)
	at picocli.CommandLine$AbstractParseResultHandler.execute(CommandLine.java:2080)
	at picocli.CommandLine.execute(CommandLine.java:1978)
	at net.preibisch.rsfish.spark.SparkRSFISH.main(SparkRSFISH.java:338)
Caused by: java.lang.StringIndexOutOfBoundsException: Range [0, 3) out of bounds for length 2
	at java.base/jdk.internal.util.Preconditions$1.apply(Preconditions.java:55)
	at java.base/jdk.internal.util.Preconditions$1.apply(Preconditions.java:52)
	at java.base/jdk.internal.util.Preconditions$4.apply(Preconditions.java:213)
	at java.base/jdk.internal.util.Preconditions$4.apply(Preconditions.java:210)
	at java.base/jdk.internal.util.Preconditions.outOfBounds(Preconditions.java:98)
	at java.base/jdk.internal.util.Preconditions.outOfBoundsCheckFromToIndex(Preconditions.java:112)
	at java.base/jdk.internal.util.Preconditions.checkFromToIndex(Preconditions.java:349)
	at java.base/java.lang.String.checkBoundsBeginEnd(String.java:4861)
	at java.base/java.lang.String.substring(String.java:2830)
	at org.apache.hadoop.util.Shell.<clinit>(Shell.java:50)
	... 26 more

I'm not sure where this out-of-bounds exception is coming from, nor how best to proceed. Any suggestions would be helpful. Thanks!
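From some digging, I suspect the failure is a Java-version parsing bug in old Hadoop: the `Shell.<clinit>` frame above suggests Hadoop 2.x takes a fixed-width `substring(0, 3)` of the `java.version` property (this is an assumption based on the stack trace, not something I've confirmed in the Hadoop source). That works for Java 8 strings like `"1.8.0_292"`, but on Java 9+ the property can be as short as `"17"`, which reproduces exactly the `Range [0, 3) out of bounds for length 2` error:

```java
// Sketch of the suspected failure mode: a fixed-width parse of java.version
// that predates the Java 9 version-string scheme (JEP 223).
public class VersionParseDemo {
    public static void main(String[] args) {
        String legacy = "1.8.0_292"; // Java 8-style version string
        String modern = "17";        // Java 9+-style version string

        System.out.println(legacy.substring(0, 3)); // fine: "1.8"
        try {
            System.out.println(modern.substring(0, 3)); // only 2 chars available
        } catch (StringIndexOutOfBoundsException e) {
            System.out.println("StringIndexOutOfBoundsException: " + e.getMessage());
        }
    }
}
```

If that is the cause, running the jar under a Java 8 JVM (or a Spark/Hadoop combination built for newer Java) might sidestep it.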
