I've used, and am still using, Julia for my PhD research. It's perfect for parallel/distributed computing, and the neural network primitives are more than enough for my purposes. Anything I write in pure Julia runs really, really fast, and the language has great profiling tools for squeezing out further performance.
Julia also integrates with Python, with stuff like PythonCall.jl. I've gotten everything to work so far, but it hasn't been smooth. The Python code is always the major bottleneck, though, so I try to avoid it.
Overall, Julia is a significantly better language in every single aspect except for the ecosystem and the occasional environment issue, which you'll often get with conda anyway. It's really a shame that practically nobody cares about it compared to Python. It supports multi-dimensional arrays as a first-class citizen, which means each package doesn't have its own array type like torch, numpy, etc., and you don't have to constantly convert between them.
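For anyone curious what that integration looks like in practice, here's a minimal PythonCall.jl sketch (it assumes the package is installed and allowed to set up its own Python environment; the numpy round-trip is just an illustration):

    using PythonCall

    np = pyimport("numpy")                       # import a Python module
    py_arr = np.linspace(0, 1, 5)                # a Py object wrapping an ndarray
    jl_arr = pyconvert(Vector{Float64}, py_arr)  # explicit conversion to a Julia Vector

    # From here on it's plain Julia data, so the usual advice of keeping
    # Python off the hot path applies.
    println(sum(jl_arr .^ 2))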
frakt0x90 38 days ago [-]
I agree on all points. I have used Python for 15 years and Julia for 3, and I reach for Julia most of the time for personal projects. I was really stoked when, at work, the only FOSS solver for our problem was in Julia, so we wrote the rest in it for easy integration. The only thing I dread is having to look for a new package, since the ecosystem can be quite fragmented.
SirHumphrey 38 days ago [-]
I actually find that the ecosystem for Julia is not that big of an issue for me. I guess that my specific use case (data analysis, numerical simulations) is probably the most developed part of the ecosystem, but within it, things are much more homogeneous than in Python, for example: most things work with most other things (e.g. units or measurement-uncertainty libraries work automatically with a plotting library).
leephillips 37 days ago [-]
As you probably know, that almost magical interworking of libraries is a consequence of Julia’s multiple dispatch and type system:
https://arstechnica.com/science/2020/10/the-unreasonable-eff...
Yes, it's a wonderful language for scientific computing, mostly by virtue of being designed from the ground up for this role rather than having it retrofitted (like Python).
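A minimal sketch of that interworking, using Measurements.jl as the uncertainty library (assuming the package is installed; the numbers are invented for illustration):

    using Measurements

    g = 9.81 ± 0.02          # quantities carrying uncertainties
    t = 2.30 ± 0.10

    # A function written with only plain numbers in mind:
    fall_distance(g, t) = g * t^2 / 2

    # Works unchanged, and the uncertainty is propagated automatically,
    # because +, *, and ^ all dispatch on the Measurement type.
    fall_distance(g, t)

Neither the function nor the library had to know about the other; dispatch does the gluing.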
shwouchk 38 days ago [-]
I actually prefer the smaller community of the Julia ecosystem. For one thing, when searching for issues online you're much less likely to hit spam “tutorials” or eternal-September stuff.
And other people's code is actually a pleasure to read.
Bostonian 38 days ago [-]
Your last sentence applies equally to Fortran. How would you compare Julia and Fortran?
ted_dunning 38 days ago [-]
Julia adds some pretty amazing stuff with multiple dispatch and run-time compilation. What this means is that you can glue code together in ways that are impossible in other languages.
One example is a system that I built using three libraries. One was a C library from Postgres for geolocation, another was Uber's H3 library (also C), and a third was a Julia-native library for geodesy. From Julia, I was able to extend the API of the H3 library and the Postgres library so that all three libraries would inter-operate transparently. This extension could be done without any mods to the packages I was importing.
Slightly similar: if you have a magic whizbang way of looking at your data as a strange form of matrix, you can simply implement a few optimized primitive matrix operations and the standard linear algebra libraries will then use your data structure. Normal languages can't really do that.
More on that second case and its implications in the following video:
https://www.youtube.com/watch?v=kc9HwsxE1OY
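To sketch that pattern concretely (the type below is invented for illustration, and a real package would also add optimized methods for the hot operations): subtype AbstractMatrix, implement the indexing primitives, and the generic routines in LinearAlgebra fall back to them.

    using LinearAlgebra

    # A matrix whose entries all share one value, stored in O(1) space.
    struct ConstMatrix{T} <: AbstractMatrix{T}
        value::T
        n::Int
    end

    Base.size(A::ConstMatrix) = (A.n, A.n)
    Base.getindex(A::ConstMatrix, i::Int, j::Int) = A.value

    A = ConstMatrix(2.0, 3)
    tr(A)        # 6.0: generic trace, written long before ConstMatrix existed
    A * ones(3)  # [6.0, 6.0, 6.0]: generic matvec falls back to getindex
    sum(A)       # 18.0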
Julia is generally higher level than Fortran, with syntax inspired by Python/R/Matlab. We've been able to reliably hire Math PhDs and quickly get them productive in Julia, which would take much longer with Fortran.
realo 38 days ago [-]
Julia uses LLVM for its JIT architecture, if I recall correctly.
That makes it a good candidate for running well on ARM platforms (think embedded data processing at the edge).
Not sure how well Fortran does on ARM.
pjmlp 38 days ago [-]
Fortran has done quite well on almost every major CPU since the 1950s, including GPUs.
Actually, one of the reasons CUDA won the hearts of researchers over OpenCL is that Khronos never cared about Fortran, and even C++ was late to the party.
I attended one Khronos webinar where the panel was puzzled by an audience question about the Fortran support roadmap.
NVidia is sponsoring the work on the LLVM Fortran frontend, so the same applies.
https://flang.llvm.org/docs/
“sponsoring” in this case means writing nearly all of it ourselves (although we’ve had lots of help from Arm and some others on specific areas like OpenMP).
pjmlp 38 days ago [-]
I see; I do follow LLVM conference talks, but not at that depth.
leephillips 37 days ago [-]
And because it runs on ARM (there are official ARM binaries at julialang.org), you can run Julia on your phone:
https://bsky.app/profile/badphysicist.bsky.social/post/3lhfm...
Julia is dynamically typed, with a very rich type system and powerful metaprogramming and polymorphism tools.
Julia also has an active thriving ecosystem, and an excellent package manager.
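A tiny illustration of the type-system and metaprogramming combination mentioned above (all names here are made up): a parametric type, a generic method dispatching on it, and a macro that receives the unevaluated expression rather than a value.

    struct Point{T<:Real}
        x::T
        y::T
    end

    # One generic method covers Point{Int}, Point{Float64}, ...
    norm2(p::Point) = p.x^2 + p.y^2

    # Metaprogramming: the macro sees the expression itself.
    macro logged(ex)
        quote
            println("evaluating: ", $(string(ex)))
            $(esc(ex))
        end
    end

    @logged norm2(Point(3, 4))   # prints the call, then returns 25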
tmvphil 38 days ago [-]
As someone working with it day to day, coming from around 18 years of mostly Python, I wish I could say my experience has been great. I find myself constantly battling the JIT, with compilation and recompilation and waiting around all the time (sometimes 10 to 15 minutes for some large projects). Widespread macro usage makes stack traces much harder to read. The lack of formal interfaces means a lot of static checking is not practical. Pkg.jl is also not great; version compatibility is kind of tacked on and has odd behavior.
Obviously there are real bright spots too (speed, multiple dispatch, a relatively flourishing ecosystem), but overall I wouldn't pick it for something new if given the choice. I'd use JAX or C++ extensions for performance and settle on Python for the high-level work, despite its obvious warts.
catgary 38 days ago [-]
Yeah, JAX with Equinox, jaxtyping, and leaning hard on Python’s static typing modules + typeguard lets you pretend that you have a nice little language embedded in Python. I swore off Julia a few years ago.
nsajko 37 days ago [-]
> Pkg.jl is also not great, version compatibility is kind of tacked on and has odd behavior.
Huh? I think Pkg is very good as far as package managers go, exceptionally so. What specifically is your issue with it?
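For context on what the compat machinery involves: bounds are ordinary semver ranges declared per dependency in Project.toml. A minimal sketch of setting one from code (the package name is arbitrary, and if I remember right the Pkg.compat function needs Julia 1.8 or later):

    using Pkg

    Pkg.activate("MyProject")        # hypothetical project directory
    Pkg.add("DataFrames")
    Pkg.compat("DataFrames", "1.6")  # writes DataFrames = "1.6" into [compat],
                                     # i.e. allows versions in [1.6.0, 2.0.0)
    Pkg.resolve()                    # the resolver now respects that bound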
jakobnissen 38 days ago [-]
It would be much more useful to see metrics that aren't cumulative if we're interested in growth.
Cumulative measurements, by definition, will never decrease, even if Julia were to fall in popularity.
tpoacher 38 days ago [-]
Indeed; something like an h5-index would be interesting to see.
cbruns 38 days ago [-]
I am a MATLAB and Python user who has flirted with Julia as a replacement. I don't love the business model of JuliaHub, which feels very similar to Mathworks in that all the cool toolboxes are gated behind a 'contact sales' button or a high-priced license. The free 20 hours of cloud usage is a non-starter. Also, it seems that all JuliaHub usage is cloud-based by default? On-prem and air-gapped use (something I need) is implied to be $$$.
Open sourcing and maintaining some components of things like JuliaSim or JuliaSim Control might expand adoption of Julia for people like me. I will never be able to convince my company to pay for JuliaHub if their pricing is similar to Mathworks.
adgjlsfhk1 38 days ago [-]
I don't think this is really a good comparison. MATLAB is $150 for a personal license for the language itself ($1000/year for commercial), and all the packages are extra on top of that. Julia is fully open source and has a strong open-source package ecosystem (I think we're up to 10k packages by now). JuliaHub provides some enterprise products like JuliaSim, but the language itself is totally free.
Kalanos 38 days ago [-]
With some serious repositioning, I think there is still an opportunity for Julia to displace Python tools like polars/pandas/numpy, airflow, and pytorch, with a unified ecosystem that makes it easy to transition to GPU and lead a differentiable-programming revolution. They have the brainpower to do it.
The future of Python's main open source data science ecosystem, numfocus, does not seem bright. Despite performance improvements, Python will always be a glue language. Python succeeds because the language and its tools are *EASY TO USE*. It has nothing to do with computer science sophistication or academic prowess; it humbly gets the job done and responds to feedback.
In comparison to mojo/max/modular, the Julia community doesn't seem concerned with capturing share from Python or picking off its use cases. That's the real problem. There is room for more than one winner here. However, have the people who wanted to give Julia a shot already done so? I hope not, because there is so much richness in their community under the hood.
catgary 38 days ago [-]
Julia has really lost the differentiable-programming mindshare to JAX. I’ve spent weeks or months getting tricky gradients to work in Julia, only to have everything “just work” in JAX. The quality of the autograd is night and day, and it comes down to basic design decisions in the respective “languages” (in the sense that JAX jit-compiles a subset of Python) and their intermediate representations.
Fundamentally, when you keep a tight, purely functional core representation of your language (e.g. jaxprs) and decompose your autograd into two steps (forward mode and a compiler-level transpose operation), you get a system in which it is substantially easier to guarantee correct gradients, which is much more composable, and which even makes it easier to define custom gradients.
Unfortunately, Julia didn’t actually have any proper PLT or compilers people involved at the outset. This is the original sin I see as someone with an interest in autograd. I’m sure someone more focused on type theory has a more cogent criticism of their design decisions in that domain and would identify a different “original sin”.
In the end, I think they’ve made a nice MATLAB alternative, but there’s a hard upper bound on what they can reach.
affinepplan 38 days ago [-]
> Julia didn’t actually have any proper PLT or compilers people involved at the outset.
while I don't disagree that currently JAX outshines Julia's autodiff options in many ways, I think comments like this are 1. false 2. rude and 3. unnecessary to make your point
catgary 38 days ago [-]
Julia was a scientific computing language made by scientific computing experts. They did a great job on some things, but whiffed a few major decisions early on.
affinepplan 38 days ago [-]
It's a general purpose language made by experts in a myriad of subjects.
catgary 38 days ago [-]
I’m sorry, but I’m going to disagree with you on that. Can you point to any of the language designers who had a background in programming language theory? The closest thing I see is Bezanson’s work on technical computing, which seems laser-focused on array programming. I don’t really see anything related to types or program transformations.
nsajko 37 days ago [-]
> whiffed a few major decisions early on
Anything particular in mind?
patrick451 37 days ago [-]
The always-on JIT was a big misstep (IMO, the opt-in TorchScript model is much better). I tried Julia a few times and it was just too slow to be usable for anything remotely exploratory. Every year or so, I'd read “TTFP has been improved”, so I'd try again, and it was still slow as molasses in Siberia. I suspect a lot of people had that experience and will be hard-pressed to give Julia a real shot at this point, even if it has fixed the problem.
catgary 37 days ago [-]
In general, I’d say there’s too much superficial flexibility but not enough control.
- I wrote this elsewhere: I find their approach to memory management/mutable arrays really hits the worst of both worlds (manual memory management and garbage collection). You end up trying to preallocate memory but don’t actually have control over memory allocations. I find the dynamic type system exacerbates this.
- It’s a very big language, even in the IR. So proper program transforms like mapping functions or autograd are quite difficult to implement.
- Static compilation is really hard, which makes it a non-starter for a lot of domains where it could have made inroads (robotics, games, etc).
EDIT: or 2 products: https://news.ycombinator.com/item?id=42962548
> The future of Python's main open source data science ecosystem, numfocus, does not seem bright. Despite performance improvements, Python will always be a glue language.
Your first sentence is a scorching hot take, but I don't see how it's justified by your second sentence.
The community always understood that Python is a glue language, which is why the bottleneck interfaces (with IO, or between array types) are implemented in lower-level languages or ABIs. The former were originally C but are now often Rust, and Apache Arrow is a great example of the latter.
The strength of Python is that when you want to do anything beyond pure computation (e.g. networking), the rest of the world has already built a package for it.
Kalanos 38 days ago [-]
So without the two-lang problem, I think all of these low-level optimization efforts across dataframes, tensors, and distributed computing would be part of a unified ecosystem based on shared compatibility.
For example, the reason why numfocus is so great is that everything was designed to work with numpy as its underlying data structure.
culebron21 37 days ago [-]
My experience with Julia was good, and the language is convenient, however two major factors made me not use it after test projects:
1. A very scarce package ecosystem. For example, there's DataFrames.jl, a poor man's implementation of Pandas.
2. Recompiling everything every time. A Julia script would take ~40 seconds to compile with DataFrames and some other important packages loaded.
I think if a language is to replace Python in science, it would need to either be very fast (recompilation on every run breaks this; running Julia in a notebook/shell is interesting, but outside of pure scientific code it should be easier to re-run a script) or offer better ergonomics. Pandas has very rough corners, especially when you need grouping with nontrivial operations or grouped window functions. Joins aren't easy either. Any system that makes this more ergonomic could bite a bit off Python. But I don't see such a system.
dizk12 37 days ago [-]
I think Tidier.jl has made working with data in Julia quite nice. TidierData leverages DataFrames.jl on the backend but with tidyverse syntax, and TidierDB.jl recreates Python's ibis and R's dbplyr. TidierDB (DuckDB is the main backend, but it supports 11 other DB backends) also enables grouped window functions quite smoothly, as well as joins and non-equi joins.
https://github.com/TidierOrg/TidierDB.jl https://github.com/TidierOrg/TidierData.jl
Of note, I am biased as a Tidier contributor and the author of TidierDB.jl, but hopefully you might be willing to give it a try.
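A short sketch of what that syntax looks like with TidierData (assuming the packages are installed; the columns are invented for the example):

    using TidierData, DataFrames

    df = DataFrame(group = ["a", "a", "b", "b"], x = [1, 2, 3, 4])

    @chain df begin
        @group_by(group)
        @summarize(total = sum(x), n = n())
        @arrange(desc(total))
    end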
> recompilation on every run breaks this
Your comment is exceedingly misleading. Whether and when Julia code gets compiled is up to the user.
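For instance, a custom system image moves compilation to a one-time build step. A minimal PackageCompiler.jl sketch (the package list, output path, and warmup script are all illustrative):

    using PackageCompiler

    create_sysimage(
        ["DataFrames", "CSV"];
        sysimage_path = "df_sysimage.so",
        precompile_execution_file = "warmup.jl",  # a script exercising typical usage
    )

    # Later sessions then skip the load/compile cost:
    #   julia --sysimage df_sysimage.so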
NeutralForest 38 days ago [-]
I like the language, but I can't help feeling that it missed the train and that the ergonomic improvements it offers are too small to justify switching over from Python.
dv_dt 38 days ago [-]
It does feel like Julia will not make the leap to displace Python; but then, for a long time Python offered too few improvements over Perl, so it's not completely out of the question.
fnands 37 days ago [-]
Yeah, at some point I was hoping it would. But after attending a day of JuliaCon last year: Julia is nowhere close to replacing Python, but it might put MATLAB in the ground.
It looks like Julia has found a few niches though: HPC and numerical modelling among them.
pjmlp 38 days ago [-]
Depends on which train the Julia folks want to board.
NeutralForest 38 days ago [-]
It felt to me like they wanted to be the language for ML/DL, which they haven't achieved. They have clearly been working more towards scientific computing + ML; all the differential equations and math packages are a testament to that (as is the pharma stuff with Pumas).
I'm not aware of what the vision is currently tbh
mbauman 38 days ago [-]
The key for me — as someone who has been around for a long time and is at JuliaHub — is that Julia excels most at problems that don't already have an efficient library implementation.
If your work is well-served by existing libraries, great! There's no need to compete against something that's already working well. But that's frequently not the case for modeling, simulation, differential equations, and SciML.
catgary 38 days ago [-]
The ODE stuff in Julia is nice, but I think Diffrax/JAX is a reasonable backbone to copy over whatever you need from there. I do think Julia is doing well in stats and has gotten some mindshare from R in that regard.
But I think a reasonably competent Python/JAX programmer can roll out whatever they need relatively easily (especially if you want to use the GPU). I do miss Tullio, though.
ssivark 38 days ago [-]
But how does one get the composability of multiple dispatch? A problem with the Python world, IME, is the need to reinvent everything in the JAX ecosystem, in the PyTorch ecosystem, etc., because of fundamental language limitations. This is tedious and feels like a lot of wasted effort.
Another example: it's frustrating that Flax had to implement its own "lifted" transformations instead of being able to just use JAX transformations, which makes it impossible to just slot a Flax model into a JAX library that integrates ODEs. Equinox might be better on this front, but that means all the models now need to be re-implemented in Equinox. The fragmentation and churn in the Python ecosystem is outrageous; the only reason it doesn't collapse under its own weight is how much funding and manpower ML stakeholders are able to pour into it.
Given how much the ecosystem depends on that sponsored effort, the popular frameworks will likely prioritize ML applications, and corollary use cases will be second-class citizens when design tradeoffs arise. E.g., framework overheads matter less when one is using large NN models vs. small models or other parametric approaches.
catgary 38 days ago [-]
So, I'll admit that I'm not a fan of multiple dispatch a la Julia; I much prefer typeclasses and explicit union types. Also, I found Julia really hits the worst of both worlds of garbage collection and manual memory management: it ostensibly has a garbage collector, but you find yourself preallocating memory or faffing around with StaticArrays, trying to figure out why memory is still being allocated (often it comes down to some type-instability nonsense, because the type system built around multiple dispatch can't correctly type the program). At this point I'd rather just use C++ or Rust than Julia; I'm getting annoyed just thinking about the nonsense I used to deal with.
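For readers who haven't seen it, the preallocation dance being described looks roughly like this (a minimal sketch; the zero-allocation result only holds while everything stays type-stable):

    using LinearAlgebra

    A = rand(100, 100)
    x = rand(100)
    y = similar(x)                   # output buffer allocated up front

    step!(y, A, x) = mul!(y, A, x)   # in-place matvec: writes into y

    step!(y, A, x)                   # warm up (first call compiles)
    @allocated step!(y, A, x)        # 0 if type-stable; a hidden type
                                     # instability silently brings allocations back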
Also, IIRC, it's not terribly difficult to use Flax with Equinox. It's just a matter of storing the weight dict and model function in an Equinox module; filter_jit will correctly recognize the weights as a dynamic variable and the Flax model as a static variable.
lagrange77 38 days ago [-]
> But I think a reasonably competent Python/JAX programmer can roll out whatever they need relatively easily
You mean in terms of the ODE stuff Julia provides?
catgary 38 days ago [-]
Diffrax is pretty well done (I think the author was basically rewriting some Julia libraries and adapting them to JAX). I can’t imagine it being too hard to adapt most SciML ODE solvers.
For simulations, JAX will choke on very “branchy” computations. But, honestly I’ve had very little success differentiating through those computations in the first place and they don’t run well on the GPU. Thus, I’m generally inclined to use wrappers around C++ (or ideally Rust) for those purposes (my use-case is usually some rigid-body dynamics style simulation).
affinepplan 38 days ago [-]
I think one really good use case is complex simulations.
joshlk 38 days ago [-]
According to Stack Overflow trends, Julia's popularity is decreasing and very small:
https://trends.stackoverflow.co/?tags=julia
That's mostly because Julia questions get answered on its Discourse or Slack. The sharp decline is due to an automatic cross-post bot that stopped working.
No one bothered fixing it, in great part due to Discourse being the main place of discussion, as far as I know.
NeutralForest 38 days ago [-]
Even languages like Python and JavaScript, which are huge, show a decline after 2022, which suggests ChatGPT is probably responsible. It would be better to have some other measure, imo.
joshlk 38 days ago [-]
It measures the proportion of questions for that language out of all languages. So if there is a general decline in Stack Overflow questions, that's already accounted for in the metric.
NeutralForest 38 days ago [-]
There are too many confounding factors still.
eigenspace 38 days ago [-]
Julia users don't go to Stack Overflow because we have better options.
mjgant 38 days ago [-]
Or that's the LLM/ChatGPT effect. You can see similar downtrends for other languages.
veqq 38 days ago [-]
Stack Overflow's popularity has decreased a lot, and many communities have left entirely.
pjmlp 38 days ago [-]
I love to see Julia grow: if nothing else, it's another Dylan-like take on Lisp ideas with a JIT compiler in the box, and the community keeps up the effort to overcome tooling issues despite the critics.
tajd 38 days ago [-]
Yeah, it's interesting to see how it's getting on! I wrote my PhD simulation code in it from the ground up, as it had nice fundamental abstractions for parallelizable code. Of course, now it's just Python and Scala/Java for me, but Julia was great for my purpose.
6gvONxR4sf7o 38 days ago [-]
I do scientific computing, and a Lisp was one of my first languages, so I feel like I ought to be the target audience, but it just never quite catches me.
It’s almost statically compilable which has almost gotten me to pick it up a few times, but apparently it still can’t compile a lot of the most important ecosystem packages yet.
The metaprogramming has almost gotten me to pick it up a few times, but apparently there aren’t mature static anti-footgun tools, even to the degree of mypy’s pseudo-static analysis, so I wouldn’t really want to use those in prod or even complex toy stuff.
It’s so damned interesting though. I hope it gets some of this eventually.
toolslive 38 days ago [-]
We do statistical modeling in Python in our company. When a statistician asked for R, I said "no, but you can have Julia". He's quite happy with it, and we're planning to move some stuff over.
kayson 38 days ago [-]
I'm curious how people feel about the JIT compilation time vs runtime tradeoff these days. Any good recent benchmarks?
affinepplan 38 days ago [-]
The Chapel folks did a really nice benchmark last year that included Julia, where it landed pretty much right on the Pareto frontier of code size vs. performance.
Also, if you exclude Julia's compile time (which may or may not be reasonable depending on what you're trying to measure), Julia would gain a lot in speed, since almost all of these benchmarks are in the 0.5 s to 10 s range.
I know that's not exactly answering your question, but you might be interested: https://chapel-lang.org/ChapelCon/2024/chamberlain-clbg.pdf
ofrzeta 37 days ago [-]
Somehow Julia is lacking the "killer app" like Ruby has with Rails.