Q&A (see recording for more in-depth answers)

Q: What is Julia's current path for scalable distributed parallelism? Julia+MPI? Or have Julia's parallelism constructs matured to the point of not requiring Julia+X for multi-node computations?
A: IMO, Julia's best distributed HPC story is Julia+MPI right now, but there are ongoing projects (such as Dagger/DaggerGPU/DaggerMPI) that aim to provide a task-based alternative.

Q: Does Julia+MPI work fine with GPUs? GPUDirect, etc.?
A: I believe so; my understanding is that the MPI implementation handles such details.

Q: Is there a Julia module on Summit?
A: Yes.

Q: Can you comment on hackathons for/using Julia?
A: There is a hackathon at every JuliaCon; it is not specific to HPC but definitely tackles HPC use cases.

Q: Julia is not meant for backend/library development, right? E.g. writing code in Julia and linking it into a C/Fortran main?
A: It's not something we have amazing tooling for yet, but work is in progress to make this possible: https://github.com/tshort/StaticCompiler.jl/

Q: How much does Julia change between versions? Can I expect that the code I write today will still work unchanged two years from now?
A: Changes are generally backwards-compatible, but there are "minor changes" which are technically breaking. Usually, you won't notice a difference, though.

Q: Any comments about JuliaGPU for portable GPU programming (https://juliagpu.org)? (AMD/NVIDIA/Intel GPUs.)
A: JuliaGPU is fully focused on supporting those GPUs. Whether it's in a laptop or a supercomputer, we generally support them if the vendor's libraries support them.

Q: Does anyone know if macOS GPUs are supported?
A: https://github.com/JuliaGPU/Metal.jl (still a work in progress)

Q: Is Julia good for plotting data?
A: Absolutely! We have many plotting packages; CairoMakie.jl is particularly useful on headless HPC nodes. Plots.jl is an alternative and was the "original" plotting package for Julia. Additionally, there is a Julia wrapper for Python's Matplotlib (PyPlot.jl).
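To make the Julia+MPI answer concrete, here is a minimal sketch of the usual MPI.jl pattern (assuming the MPI.jl package is installed and launched under an MPI runner, e.g. `mpiexec -n 4 julia script.jl`); the program below is illustrative, not from the talk:

```julia
using MPI

MPI.Init()
comm = MPI.COMM_WORLD
rank = MPI.Comm_rank(comm)     # this process's rank (0-based)
nranks = MPI.Comm_size(comm)   # total number of ranks

# Each rank contributes its rank number; Allreduce sums the
# contributions across all ranks and returns the result everywhere.
total = MPI.Allreduce(rank, +, comm)
println("rank $rank of $nranks sees total = $total")

MPI.Finalize()
```

The same collective calls accept GPU arrays (e.g. `CuArray`s) when the underlying MPI library is GPU-aware, which is what the GPUDirect question above refers to.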
Q: How suitable would Julia be as a portable layer across the vendor-specific GPU packages?
A: KernelAbstractions.jl aims to be that layer: https://github.com/JuliaGPU/KernelAbstractions.jl

Helpful info:
FYI, direct link to Youngsung's recent talk, as it came up a few times: https://docs.google.com/presentation/d/1twESwvD7rlw1AjpRdrxjN4bDHjxIkY8C5yDixH7aKyA/edit#slide=id.p
If you aren't already on it, I'd highly recommend joining Julia's Slack: https://julialang.org/slack/
Not sure if this will be covered, but Pluto's use of plain Julia files is important when doing version control with git, where Jupyter notebooks are very unsuitable, being glorified JSON files on disk.
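To illustrate the KernelAbstractions.jl answer above, here is a small sketch of a vendor-portable kernel (assuming KernelAbstractions.jl is installed; the kernel name `axpy!` and the array sizes are our own choices). The same kernel runs on GPUs by swapping `CPU()` for a GPU backend such as `CUDABackend()` or `ROCBackend()`:

```julia
using KernelAbstractions

# A portable "y .+= a .* x" kernel: each work-item handles one index.
@kernel function axpy!(y, a, @Const(x))
    i = @index(Global)
    @inbounds y[i] += a * x[i]
end

backend = CPU()                     # swap for CUDABackend(), ROCBackend(), ...
x = ones(Float32, 1024)
y = zeros(Float32, 1024)

kernel = axpy!(backend)             # instantiate the kernel for this backend
kernel(y, 2f0, x; ndrange=length(y))
KernelAbstractions.synchronize(backend)
```

After synchronization, every element of `y` holds `2.0f0`; the code itself never mentions a vendor API, which is the portability point being made.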