Can anyone think of a reason why Stdout.write would flush but Stdout.line doesn't?
use roc_std::RocStr;
use std::io::Write; // needed for .flush() below

#[no_mangle]
pub extern "C" fn roc_fx_stdoutLine(line: &RocStr) {
let string = line.as_str();
println!("{}", string);
}
#[no_mangle]
pub extern "C" fn roc_fx_stdoutWrite(text: &RocStr) {
let string = text.as_str();
print!("{}", string);
std::io::stdout().flush().unwrap();
}
Cause somebody needed it to flush for a CLI game or something
At least, I think that is why
Probably should add a separate flush effect
I can do that, just add a separate flush effect. Would it be acceptable for it to flush stdout and stderr with the same effect, or would it be better to have two separate ones?
Like Stdout.flush
and Stderr.flush
I guess it should handle the errors too, like if not all bytes could be written due to I/O errors or EOF being reached. So more like Stdout.flush : Task {} [IOError, EOF]
or something
Yeah, should be separate.
Also, technically even write and line should also have error returns, but I'm not sure what our decision around that for basic CLI was. I think we were ignoring some of those kinds of errors for simplicity of the platform, but I guess it is up to us overall to decide.
I think for most simple platforms, people just care about stdin returning errors so they can recognize EOF.
Currently implementing error handling for flush, would be easy to add RN if we are happy with that API change
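Roughly this shape on the host side, as a minimal sketch (the effect name is a placeholder, and a real version would hand the IO error back to Roc rather than just logging it):
use std::io::Write;

// Hypothetical host effect for an explicit flush; error plumbing back to Roc
// is left out to keep the sketch small.
#[no_mangle]
pub extern "C" fn roc_fx_stdoutFlush() {
    if let Err(err) = std::io::stdout().flush() {
        eprintln!("failed to flush stdout: {}", err);
    }
}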
Brendan Hansknecht said:
Also, technically even write and line should also have error returns, but I'm not sure what our decision around that for basic CLI was. I think we were ignoring some of those kinds of errors for simplicity of the platform, but I guess it is up to us overall to decide.
yeah I think if you want to write with error handling, that should be the unusual case
it should be supported, but it shouldn't be the default because that's not what people want 99% of the time :big_smile:
Having played with this a bit I think it makes sense. My logic is: use Task.attempt instead of Task.await, so we can always just ignore the result if we don't care about it. For example:
# Log request date, method and url
date <- Utc.now |> Task.map Utc.toIso8601Str |> Task.await
_ <- Stdout.line "\(date) \(Http.methodToStr req.method) \(req.url)" |> Task.attempt
_ <- Stdout.flush |> Task.attempt
I'm pretty sure it's normal stdout behavior to automatically flush when it encounters a newline: https://stackoverflow.com/a/65385249
I don't think Stdin.line or Stdin.bytes make sense in the basic-webserver context so I am removing these
Any idea why this generated glue assertion might be incorrect? Is this an easy fix?
Screenshot-2023-11-19-at-14.16.02.png
My current workaround is to comment out this line for each architecture.
difference between platforms maybe (I notice you are on aarch)
That was in the platform/src/glue_manual/src/aarch64.rs file
I'm working on this https://github.com/roc-lang/basic-webserver/pull/8 PR
Luke Boswell said:
I don't think Stdin.line or Stdin.bytes make sense in the basic-webserver context so I am removing these
I don't think Stdin
makes sense for basic-webserver in general :big_smile:
Maybe you want to pipe in something to serve? Seems unusual though
I'm currently coming up against this which is causing me troubles making a RocDict
impl<K: Hash, V> RocDict<K, V> {
unsafe fn insert_unchecked(&mut self, _key: K, _val: V) {
todo!();
}
}
I'm going to temporarily change the API for Env.dict : Task (Dict Str Str) *
to Env.list: Task (List (Str, Str)) *
as a workaround as I have been unable to make a RocDict from rust
Did we ever implement RocDict in Rust?
Aside, Dict may be one of the most painful pieces for platforms/glue to implement. Especially since they have to depend on a ton of implementation details (like the hashing function among others)
Luke Boswell said:
Can anyone think of a reason why Stdout.write would flush but Stdout.line doesn't?
#[no_mangle]
pub extern "C" fn roc_fx_stdoutLine(line: &RocStr) {
    let string = line.as_str();
    println!("{}", string);
}
#[no_mangle]
pub extern "C" fn roc_fx_stdoutWrite(text: &RocStr) {
    let string = text.as_str();
    print!("{}", string);
    std::io::stdout().flush().unwrap();
}
Since stdout is line-buffered by default, doesn't println! flush automatically?
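For what it's worth, a tiny standalone illustration (assuming stdout is attached to a terminal, where Rust buffers it line by line):
use std::io::Write;
use std::{thread, time::Duration};

fn main() {
    // No trailing newline, so this can sit in the line buffer...
    print!("thinking...");
    // ...until we flush explicitly.
    std::io::stdout().flush().unwrap();
    thread::sleep(Duration::from_secs(1));
    // println! ends with a newline, so on a terminal it shows up right away.
    println!(" done");
}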
I think https://github.com/roc-lang/basic-cli/releases/tag/0.6.1 somehow regressed, with calls to Http now failing with 301 for some reason... I'm just investigating now
I'm wondering if we need to regenerate glue, I do have a PR for that but it hasn't passed CI yet.
$ echo "http://roc-lang.org" | roc run examples/http-get.roc
🔨 Rebuilding platform...
Enter a URL to fetch. It must contain a scheme like "http://" or "https://".
Request failed with status 301
Building from source it works just fine for me
301 is expected for http://roc-lang.org
it is redirecting to https://www.roc-lang.org
I get the same with echo "http://www.roc-lang.org" | roc run examples/http-get.roc
:face_palm:
Thank you @Brendan Hansknecht
I hadn't paid enough attention to the http vs https too. Also, there was a bug in the way I was using Dir.list.
I found this https://sans-io.readthedocs.io/ and thought it may be a good reference for future implementations in pure Roc.
There is an issue in basic-webserver CI I haven't seen before -- it should be totally unrelated to the changes in that PR. The surgical linker is unhappy with __umodti3. I've had a quick look and haven't resolved it yet, posting here in case anyone else is able to look at this further. I'll circle back later, but want to progress some other things rn.
thread 'main' panicked at 'Undefined Symbol in relocation, (+4ef2, Relocation { kind: PltRelative, encoding: Generic, size: +20, target: Symbol(SymbolIndex(+88)), addend: +fffffffffffffffc, implicit_addend: false }): Ok(Symbol { name: "__umodti3", address: +0, size: +0, kind: Unknown, section: Undefined, scope: Unknown, weak: false, flags: Elf { st_info: +10, st_other: +0 } })', crates/linker/src/elf.rs:1486:25
I cannot reproduce the failures in CI for https://github.com/roc-lang/roc/pull/6415
I've tried all kinds of different things, and am tempted to re-run CI and see if maybe that resolves things? Otherwise I'm out of ideas
27 messages were moved from this topic to #ideas > package shorthands when compiling interface modules by Richard Feldman.
I've been looking at fixing a bug regarding aliasing optional record fields. The core of this seems to be a parsing error. I've read through the parser file, but still not sure how to modify the parser (and any implications after that). Are there any other places I can read to get some more context on how to implement this fix?
I am not very educated on the parser, but the root of the issue is here: https://github.com/roc-lang/roc/blob/9bf57d63370f69093c5ce2660d39cafe9a424aa7/crates/compiler/parse/src/pattern.rs#L494-L500
You can see that it selects either of the two instead of optionally accepting : and then optionally accepting ?
If I understand correctly, which I definitely might not, I think the changes would just be parser-local, since you are just changing the variable name of the parser output when handling the rename and the optional default.
Checkout the :cool: diagram I made for my basic-cli build script :sunglasses:
https://github.com/roc-lang/basic-cli/tree/refactor-host?tab=readme-ov-file#building-the-platform
Luke Boswell said:
Checkout the :cool: diagram I made for my basic-cli build script :sunglasses:
https://github.com/roc-lang/basic-cli/tree/refactor-host?tab=readme-ov-file#building-the-platform
What did you make it with?
PowerPoint
Classic
Quite helpful! That looks familiar. Was that from one of your recent meetup presentations?
No, I just put it together yesterday for Anton. He wanted something for the new basic-cli. I've got a PR that should hopefully land soon which refactors the rust code out into crates from the roc platform code. So to build the platform we use rust's cargo toolchain. I added a build script using (ironically a release of) basic-cli to drive all the different steps.
This is in preparation for removing the platform rebuilding from the roc cli. Platforms should be responsible for building the binaries for their host using the native toolchains, i.e. zig, cargo, etc. Roc shouldn't need to have dependencies on any of this. Roc should only need to understand how to link the host binaries using one of the linkers (legacy, additive or surgical).
FYI -- I've got family staying with me this weekend (including 4x under the age of 5) so I don't expect I'll get much time to dig into to roc things. :beach_umbrella:
nice, enjoy the family time! :smiley:
Git noob question -- can anyone tell me why this PR has a bunch of unrelated commits in the history in GH? https://github.com/roc-lang/roc/pull/6859
I suspect it is because I merged in remote/main. But those other commits are from previous PRs that have been merged already
This is why I hate merge and always rebase if possible
I just did something super ugly, and rebased then squashed
Seems to be happy now
How do we feel about using resources from a CDN for examples?
I was thinking of adding a small text editor example to basic-webserver using htmx and quilljs, with the sqlite backend.
The examples should be reflective of how we think people would write real-life Roc code, just smaller. If you think it's better for someone to depend on a CDN instead of including the files in the Roc binary, then that works for me.
It seems like that would be useful if they want to keep a smaller binary size.
The other place this kind of example could live is in roc-lang/examples. It's not necessarily specific to basic-webserver in that it brings in other resources and is somewhat opinionated about how to do things (htmx, quill etc), so just demonstrates one way to do things.
If I'm getting super excited I might even pull in tailwindcss from a CDN to make it look ok
Or I guess I could just make it a blog post. I'm not sure how we'd make an expect test for this in CI -- and it's more stuff to maintain.
can we just host them on GH pages in one of our repos or something?
so it's a CDN but it's a CDN we know isn't going to 404 without our knowing about it :big_smile:
I could add them to the static files hosted on the roc website.
If we do that, it would make more sense for this kind of example to live in roc-lang/examples
I just don't know if we want an example like this and have the additional burden of maintaining it.
I could just make a blog post, and then we link it under "external" examples.
How do we draw that line?
external seems fine for now I guess, as long as we're using a CDN that seems likely to be up for a long time
small text editor example to basic-webserver using htmx and quilljs, with the sqlite backend.
This seems quite niche, I would put it in its own repo and link to it on roc-awesome
Turned out to be really simple, and quite small. So I just shared in a gist in show and tell.
In Germany and Austria, we had cases where people who used Google Fonts from the Google servers got sued for privacy reasons. The cases came to a good end. But it is still not decided whether it is legal in Europe to use Google Fonts, or for the same reason a CDN, when the data is processed outside of Europe.
I had cases in my circle of friends where they got an "Abmahnung". (I have no idea how to translate this. It is a really problematic legal construct we have here.) I am absolutely sure that a high court will allow CDNs, but I don't want to be the person that has to go to court for that. So I removed all CDNs from all my pages. I also know of companies where they had to remove them.
So if it's possible, I would always advise people in Germany not to use a CDN and to serve the files from their own service instead.
"Abmahnung"
We have a Dutch word for that as well, perhaps this is similar to a "Cease and Desist"?
Maybe. An "Abmahnung" is a letter from a lawyer, which is so friendly as to inform you that you are doing something wrong. You are supposed to sign a declaration that you will not do this again in the future, and you should also pay the lawyer's bill. This may not be a big problem for companies, but it is for individuals and small charity organizations. A lot of them just do what the letter says and pay the bill. So there are lawyers who have built their business model on this system. We even have a word for that, "Abmahnindustrie"; maybe ChatGPT can explain it better.
Speaking of CDNs
https://httptoolkit.com/blog/public-cdn-risks/
Have we seen this before on linux x86? Just doing a fresh install of my server and building from source
= note: /usr/bin/ld: cannot find -lz: No such file or directory
/usr/bin/ld: cannot find -lzstd: No such file or directory
collect2: error: ld returned 1 exit status
@Anton
$ sudo apt install libz-dev libzstd-dev
Seems to have worked ok and given me the correct dependencies... is this worth adding to the getting started guide for linux?
The getting started page is targeted towards Roc users using the nightly releases, I would add it to BUILDING_FROM_SOURCE.md
Not really something to open a new thread about, but I thought I would share that I spent some time today updating my https://github.com/lukewilliamboswell/roc-htmx-tailwindcss-demo/tree/main demo -- it's now working again.
I encountered a number of compiler bugs along the way which made the process challenging.
One example is that I've stripped out the session management, as that was giving me grief and was responsible for an alias analysis bug somewhere.
Also, this bug currently prevents Isaac's RTL from finishing which needs to be manually worked around.
I don't have any specific issue here to resolve or anything, but more a general observation that the experience wasn't great with our current state.
I like coming back to this demo and using it to help find bugs and test changes.
relatedly, I have a mostly working (with some gaps; it's not all the way there yet) implementation of Ayaz's standalone monomorphization-of-types pass, which is part 1 of the new lambda sets overhaul
gives a pretty amusing overview of the magnitude of difference between proof-of-concept OCaml and production Rust implementation :big_smile:
(most of the difference is just in the proof-of-concept type system being intentionally way simpler)
also I'm trying out a new "never panic" strategy for dealing with errors: everything takes a problems: &mut Vec<Problem>
and if something happens that's definitely a compiler bug, we just push into there and continue as best we can
if all the passes end up doing that strategy, then we won't have any compiler panics, and even if there are bugs, you'll still get some partial (and hopefully helpful enough to unblock you) errors because we won't have panicked prior to the reporting step
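something like this shape, as a tiny sketch (made-up names, not the actual compiler types):
// Sketch of the "record the problem and keep going" pattern.
#[derive(Debug)]
enum Problem {
    CompilerBug(&'static str),
}

fn lower(expr: i64, problems: &mut Vec<Problem>) -> i64 {
    if expr < 0 {
        // An "impossible" state: instead of panicking, note it and continue
        // with a best-effort placeholder so later stages can still run.
        problems.push(Problem::CompilerBug("negative expr should be impossible here"));
        return 0;
    }
    expr * 2
}

fn main() {
    let mut problems = Vec::new();
    let out = lower(-1, &mut problems);
    println!("result: {out}, problems: {problems:?}");
}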
Will also make tracking them down and fixing much easier I imagine
maybe haha...right now they aren't including much context, but in the future we could throw a backtrace in there or something
mainly I'm just trying to make it so they don't panic, but do record where the problem is so we have some way of handling it better down the line
I guess I'm thinking we won't be throwing away the previous problems we've collected, which may be useful for the user to fix also
Luke Boswell said:
Not really something to open a new thread about, but I thought I would share that I spent some time today updating my https://github.com/lukewilliamboswell/roc-htmx-tailwindcss-demo/tree/main demo -- it's now working again.
It's a cool demo, I've been trying to find a way into Roc and I was taking a look at it last night, even though it was in a kind of broken state it was fun to play around with.
Glad you enjoyed looking at it. There's a bunch of things in there, now that I'm looking at it, that I'd like to improve. When I last really touched it was before module params... so I think there is plenty of space to explore with refactoring to use those better.
I've also used it to find API designs for basic-webserver... it's good to have a larger app to help find which helpers or types we might want to include in the platform.
I'll have to look into module params. :note:
Question, is there a way to get the LSP to show autocomplete suggestions for package imports? or does that require changes to the LSP
I'm not sure, I know we've discussed in zulip at some point. @Eli Dowling would probably know I think.
It may be something specific to an editor though. I'm not familiar with the LSP api myself
Convert a type-checked canonical IR to a monomorphized IR by creating specializations of all functions, such that they are monomorphic in types
Just to check my understanding. After this we have no type variables, the functions are duplicated (specialized) for each possible combination of types that is used.
correct
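For example, conceptually (a toy Rust illustration of the idea, not the actual Mono IR):
// One polymorphic definition as written in the source...
fn id<T>(x: T) -> T {
    x
}

// ...conceptually becomes one monomorphic copy per concrete type it is
// actually used with, so no type variables remain afterwards.
fn id_u64(x: u64) -> u64 {
    x
}

fn id_str(x: String) -> String {
    x
}

fn main() {
    // The generic version as written in source...
    println!("{}", id(42u64));
    // ...and the specialized copies the compiler effectively produces.
    println!("{}", id_u64(42));
    println!("{}", id_str(String::from("hi")));
}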
Just trying to follow this from a high level --- I can see we have the Subs types in crates/compiler/types/src/subs.rs, does the Mono IR (monomorphized IR) referred to here also use these same types?
Screenshot 2024-09-29 at 13.00.11.png
I'm guessing mono is in crates/compiler/mono/src/ir.rs
In https://github.com/roc-lang/roc/pull/7130 snobee asks for clarification
I'm making some assumptions here that I wanted to confirm:
1. All definitions begin at the start of a line
2. The left side of a type annotation is the exact same as the left side of a definition
This sounds correct to me. Just wanted to ask here too.
I believe that's correct
Luke Boswell said:
Just trying to follow this from a high level --- I can see we have the Subs types in crates/compiler/types/src/subs.rs, does the Mono IR (monomorphized IR) referred to here also use these same types?
No, the type representation is different between mono and the earlier stages
Luke Boswell said:
I'm not sure, I know we've discussed in zulip at some point. Eli Dowling would probably know I think.
It may be something specific to an editor though. I'm not familiar with the LSP api myself
That does indeed require changes to the lsp. It's been a while since I looked at it, but from my memory the main issue was that before module Params and the changes to roc project structure that came with it, there wasn't a good way by default to find which file was the "root/module entry" of a project which made it mostly impossible to resolve what packages should be available in a particular file.
But I believe that should be sorted now, so it's definitely something that could be implemented!
There is kind of a uniform path for all the data to get into the lang-server from the compiler, and one section where we look for all the possible completions in the current scope.
Again, just a guess, but adding it would be a matter of the steps below. If you wanted to take a look, my suggestion is to follow how we get in-file completions vs imported completions.
The main process when doing the actual completion is:
1. Figure out that we are inside a package import using the AST (similar to how we find which completions are available at the current scope).
2. Perform a lookup that's similar to the current imported completions, but search all packages instead of only the ones we've already imported.
@Nathan Kramer if you wanted to take a crack at it, I'd be very happy to advise. I've been meaning to write some better docs for the lsp code anyway :)
Eli Dowling said:
@Nathan Kramer if you wanted to take a crack at it, I'd be very happy to advise. I've been meaning to write some better docs for the lsp code anyway :)
I think it's a bit outside the scope of my competence... I don't know rust :laughing:
Appreciate you taking the time to write out these thoughts, maybe I can work my way up to something like that!
Hahah, I see. Yeah, learning rust is certainly a bit of a hurdle to get over. Obviously I misunderstood :sweat_smile:
I go to clean up 1 thing... and end up cleaning up 5 others... and I still haven't yet got back to the thing I started with. :sweat_smile:
I've implemented a few fixes for basic-webserver... and then to test it I thought I'd use the htmx example, and next thing I know I'm updating roc-ansi and random other things.
It takes some discipline to not clean up 40 things at once, but it's not like any of us are complaining when you do it!
@Brendan Hansknecht -- I've made that change to basic-webserver using a tag union for the method. I'm getting a segfault though.
I'm thinking we might be able to make some unit tests to be sure it's working correctly, but also to help isolate the bug. I'm going to try that.
Here's the glue module I've got so far https://gist.github.com/lukewilliamboswell/e13968df1b305288555cbdadf84a9078
Lol, immediately after I wrote this, I think I realised a way we could dramatically simplify the glue. Move the RocStr value up into the RequestToAndFromHost struct. It'll be empty most of the time, but that's ok.
Is there some way we can have an Option<xxx> in a Roc record so we're not allocating an empty string?
Empty strings don't allocate
So should be fine to just leave the string in
Hmm, the problem with this plan though is that we then introduce a deep copy for every request & response pair, because now the platform will translate between the host version and what the app sees
I think we might be doing that every time anyway
Yeah, I'm ditching this plan. It's just a bit too fragile and I want to move on.
Is there anyone online who has an Intel mac and would be free to help me with something?
Just going back and forth on this PR https://github.com/lukewilliamboswell/roc-platform-template-zig/pull/3 with CI. I think I've found a fix, but then every time I have to wait for Nix to do its thing, which takes a while.
I could probably make my life easier if I just used a nightly... but where's the fun in that
All good now... figured it out
Luke Boswell said:
Is there anyone online who has an Intel mac and would be free to help me with something?
I think @Folkert de Vries has one if I remember right! :big_smile:
oh nm glad it worked out!
Yeah, I figured out how to make a nice GH matrix runner that tests the platform against most of the roc supported os/archs
Between that and nix, I'm pretty happy with the coverage
Luke Boswell said:
Have we seen this before on linux x86? Just doing a fresh install of my server and building from source
= note: /usr/bin/ld: cannot find -lz: No such file or directory
/usr/bin/ld: cannot find -lzstd: No such file or directory
collect2: error: ld returned 1 exit status
I'm seeing a similar error on MacOS (with M1 Pro).
= note: ld: library not found for -lzstd
clang: error: linker command failed with exit code 1 (use -v to see invocation)
error: could not compile `roc_cli` (bin "roc") due to 1 previous error
I have zstd installed, and have even tried adding its path to LDFLAGS, no dice. Anyone seen that before?
And I'm sorry, but I don't have the time to grok everything nix wants to do to my system, so I don't plan on installing it
I have zstd installed, and have even tried adding its path to LDFLAGS, no dice. Anyone seen that before?
Can you tell me the path where zstd is installed?
but I don't have the time to grok everything nix wants to do to my system
I've never experienced any problems after installing nix but I understand the desire to know everything it does.
It's /opt/homebrew/opt/zstd
which is a symlink to /opt/homebrew/Cellar/zstd/1.5.6
Anton said:
but I don't have the time to grok everything nix wants to do to my system
I've never experienced any problems after installing nix but I understand the desire to know everything it does.
Yeah, just going through the installer it talks about adding users and groups and a top level directory and more. I just need to spend some time eventually to get it. I know some people I trust use it and love it.
I've tried
export LDFLAGS="-L/usr/local/opt/llvm/lib,-L/opt/homebrew/opt,-rpath,/usr/local/opt/llvm/lib"
export LDFLAGS="-L/usr/local/opt/llvm/lib,-L/opt/homebrew/Cellar/zstd/1.5.6,-rpath,/usr/local/opt/llvm/lib"
And then both of those with -Wl put in various places (which yes, means I don't understand ld
very well even though I read the entire man page).
Also tried paring down LDFLAGS to just -L/opt/homebrew/opt
OR -L/opt/homebrew/Cellar/zstd/1.5.6
and no dice
Running find / -name zstd
I see that I have it also in a few Go packages, in some Cargo crates, and in Go itself
I'll check and compare on my mac in a bit
Hey @Anton I figured it out. Did some searching on here and found another thread from @Luke Boswell about this. I needed:
export LIBRARY_PATH="/opt/homebrew/Cellar/zstd/1.5.6/lib"
Had nothing to do with LDFLAGS. Don't worry, that was only an hour of my life wasted :cry:
Maybe we should add this to the BUILDING_FROM_SOURCE.md file?
It was right in the github workflow for macos apple silicon.... :man_facepalming:
I thought we removed the zstd dependency :thinking:
maybe that was something else?
In good news, from my local repl:
» this_should_work = 1
1 : Num *
» holy_hell = "This worked"
"This worked" : Str
» repl_is_slow = True
True : [True]
» bad_tag = Yuck_This_Is_God_Awful
── SYNTAX PROBLEM ──────────────────────────────────────────────────────────────
Underscores are not allowed in tag or opaque ref names:
10│ bad_tag = Yuck_This_Is_God_Awful
^^^^^^^^^^^^^^^^^^^^^^
I recommend using PascalCase. It's the standard style in Roc code!
» bad_ref = @Yuck_This_Is_God_Awful
── SYNTAX PROBLEM ──────────────────────────────────────────────────────────────
Underscores are not allowed in tag or opaque ref names:
12│ bad_ref = @Yuck_This_Is_God_Awful
^^^^^^^^^^^^^^^^^^^^^^^
I recommend using PascalCase. It's the standard style in Roc code!
And
» how_about___this_kind_thing = 42
── NAMING PROBLEM ──────────────────────────────────────────────────────────────
I am trying to parse an identifier here:
14│ how_about___this_kind_thing = 42
^^^^^^^^^^^^^^^^^^^^^^^^^^^
While snake case is allowed here, only a single consecutive underscore
should be used.
I thought we removed the zstd dependency :thinking:
I think it was removed but we had to put it back
While waiting for some substantive review of my PR, I'm going to play with updating an example to use snake_case and build it
Sweet, after changing all references of fizzBuzz to fizz_buzz in examples/FizzBuzz:
…/roc-examples/examples/FizzBuzz main ! 08:51
❯ ../../../roc/target/debug/roc build
Downloading https://github.com/roc-lang/basic-cli/releases/download/0.16.0/O00IPk-Krg_diNS2dVWlI0ZQP794Vctxzv0ha96mK0E.tar.br
into /Users/anthonybullard/.cache/roc/packages
[8.9 / 8.9 MB]
0 errors and 0 warnings found in 10613 ms
while successfully building:
main
…/roc-examples/examples/FizzBuzz main ! 08:52
❯ ./main
1,2,Fizz,4,Buzz,Fizz,7,8,Fizz,Buzz,11,Fizz,13,14,FizzBuzz,16,17,Fizz,19,Buzz,Fizz,22,23,Fizz,Buzz,26,Fizz,28,29,FizzBuzz,31,32,Fizz,34,Buzz,Fizz,37,38,Fizz,Buzz,41,Fizz,43,44,FizzBuzz,46,47,Fizz,49,Buzz,Fizz,52,53,Fizz,Buzz,56,Fizz,58,59,FizzBuzz,61,62,Fizz,64,Buzz,Fizz,67,68,Fizz,Buzz,71,Fizz,73,74,FizzBuzz,76,77,Fizz,79,Buzz,Fizz,82,83,Fizz,Buzz,86,Fizz,88,89,FizzBuzz,91,92,Fizz,94,Buzz,Fizz,97,98,Fizz,Buzz
Anton said:
I thought we removed the zstd dependency :thinking:
I think it was removed but we had to put it back
ah I see - llvm needs it
so I guess once we're building our own llvm from source, we can drop the dependency?
most likely
This hasn't gone as swimmingly since. It seems that there is a problem with the latest basic-cli platform, where the Arg module is still using Str.split, which is not available in the latest version of the compiler
Just linking to my comments here: https://roc.zulipchat.com/#narrow/channel/302903-platform-development/topic/Platform.20.2F.20Language.20version.20compatibility/near/483949925
My Roc journey this month:
Updating the tutorial is really helping me understand the scope of recent changes :smile:
I'm sorry for the rabbit hole :rabbit: :hole:
I'm stomping through the tall grass before the schoolbus arrives next week!
How is December so soon?
Basic CLI plans to support sqlite eventually, right?
Just thinking about the bundle of sqlite improvements that got reverted from basic webserver.
I might try to add them as a standalone crate and module in basic CLI (like @Luke Boswell has been doing for other shared effects). Then I would work to add it back into basic webserver.
Sounds great :grinning_face_with_smiling_eyes:
What is the current reasoning about things like sqlite (shared libs) and platforms? As they make sense to be used across different platforms, it would be great if there was one great API for it instead of "learn the API for the same thing for each platform".
I've not followed much of the conversation for the last several months (due to becoming a dad! :tada: ) - so I wonder if the general conclusion was that platforms benefit from being batteries included instead of composing shared effectful libs.
So sqlite is always special. If you want it, you must embed it in the platform. There is no implementing sqlite in roc. With many other things, like any other database, you can make a roc library built over tcp.
Anyway, with basic cli and basic webserver, we are starting to do basic rust crate sharing
So many effects are implemented in rust crates shared between the two platforms.
Then a thin duplicate roc shell
We could write a library for sqld/libsql and use the http protocol, but as far as “platform level capability” composition I hope we come up with something better than just sharing implementations in the long run. I’d like to have a singular Roc interface to many of these that also could be shared (and maybe built upon by plain Roc packages)
I think the correct way to use sqlite is to embed it and skip any protocols. This means the primitives must be in the platform.
They can still be shared and libraries can still build on top of them, but the primitives would be required in each platform that wants to support sqlite
The only other reasonable alternative is generic ffi (which roc could support, but is a much more complex topic)
Someone could also make something to use the http protocol for sqlite, but, at least for me, that defeats essentially the entire point of using sqlite.
Turso would disagree :wink:. They also have a binary protocol as well. You still have an extremely fast battle-tested database that is file based (and libsql has a bunch of improvements over mainline SQLite)
But I don’t want to argue about SQLite in specific. I worry that having singular monolithic platforms instead of a runtime of effects that can be composed will, at scale, cause issues with adoption and many many speed bumps to a lot of reasonable use cases
SQLite is an outlier database because it's the only one that's normally bundled as a dylib
Postgres, MySQL, etc operate over a socket by default and so there's no problem making platform-agnostic pure-Roc packages to talk to them, such as roc-pg
that's why these conversations always come up in the context of SQLite specifically: it's the only one that makes this tricky :big_smile:
Yeah, I think SQlite is specifically a bad example
There are lots of other tools and effects to worry about.
That said, I think sharing via rust crates is totally reasonable
Slightly tangential to this conversation, but tree-sitter might be a good example of a tool that's written in order to make it easy to use from different programming languages.
Most of it is written in C so any language with C-ffi can pull it in. I've been integrating tree-sitter into a Zig platform I'm working on, and also tree-sitter-highlight, a tool bundled with tree-sitter, written in Rust, with hooks for calling it from C.
Another thought is that it might also be interesting to watch how different platforms integrate sqlite for a while before we start investing in tooling to make it easier. For my Zig platform I needed a library for generating HTML. Hannes wrote one that I could have used, roc-html, but I realized that for my platform an ideal Html type would internally be represented like this:
Xml : List [
    FromSource { start: U32, end: U32 },
    RocGenerated (List U8),
]
I.e., a list of slices, some copied from a source file, others generated using functions like Html.div. This is super specific to my platform, so no generic HTML library will define the HTML type like this. There might still be a way to do code sharing here: possibly an HTML library might define all the helper functions for constructing different node types, but require you to pass in a custom internal XML type and node function.
I'm curious if we'll see similar situations with regards to sqlite, or tree-sitter, or other libraries designed for performance and integration via C-ffi. Platform authors might find that they can integrate the C dependency in a way specific to their platform that squeezes out some extra performance. It's hard to know upfront, given the platform concept is such a new paradigm.
I remember an idea discussed here in the likes of "libraries that require the platform to expose some types that implement abilities x y z".
That way a general html library might work with a generic Html msg type while multiple platforms might pick up that type and render it in different ways (e.g. clay is an immediate mode rendering library that was just released that supports rendering as html, canvas and opengl - maybe a game engine platform could use it to render some html content using the same API)
Yeah, that still can be the basis for a lot of sharing
Can someone find the zulip discussion for ?? (providing defaults for results) as in #7089?
It's a couple different discussions. I'm on it
First mention by Richard: #ideas > `try-else` error context adding syntax @ 💬
Okay, not that many discussions
That's basically the last one that isn't just giving usage examples
@Anton did I ship something that's not meeting your expectations?
No, that's not it, I was not immediately sold on the concept of the ?? operator and wanted to look at the discussion that led to it.
Yeah, I think it was suggested and no one really had an issue with it. So it was added (or planned for addition)
@Anton If you aren't a fan, I'd like to hear why and have a discussion about it before I implement the ? binop
I was doubting whether Result.withDefault is used so often that there should be an operator for it. I now see that ?? will also be used for default value record fields, so it seems justified :+1:
I think Result.withDefault will be used a lot for things like getting data out of a list, set, or map.
Pretty much like an Option.withDefault if we had Options
That's how I found myself using it in AOC at least
like getting data out of a list, set, or map.
Would using Result.withDefault in these cases not be a point of silent failure?
It all depends on the use case
But yes many times it would be.
I think the main cases will be for those "scripty" sort of workflows
Not 100% sure where this should go, so just asking it here:
How does one transfer a PGP key between computers?
you can use gpg --export-secret-key
This what I use for two linux machines:
# on old pc:
gpg --list-keys --keyid-format SHORT | grep ^pub
gpg --export --armor [Your_Key_ID] > public.key
# on new pc
gpg --import public.key
I'm going to be hacking on the type checker for the next few hours. If anyone would like to join you're most welcome.
I'm still at the "how does this thing work" phase of things... so it's pretty slow progress
at the most basic level, I like to think of Subs as a "type database" (the name of that data structure could probably be better :sweat_smile:)
Variable is like a TypeId - the key in the key/value store of the types database
and then the value in that key/value store is either a specific type like "this is a function with these arguments and this return value..." or else a "symlink" to another TypeId - like "see this other thing for what my type is"
a really critical performance aspect of our type unification algorithm is that you're allowed to "compact" chains of symlinks at any point in the process without there being any chance of that changing the answer at the end of type-checking
in other words, if the type of A is "symlinked" to be whatever the type of B is, and B is "symlinked" to be whatever the type of C is, then we can compact that to have A be symlinked directly to C
which means that later whenever we're looking up the final type of A (which can come up a lot during type inference, e.g. to check if something has a type mismatch with A), we don't have to go through B, we can go straight to C
which means the symlink chains never get longer than 1 hop, instead of potentially growing to N hops - which could really get out of hand if it were allowed to happen :big_smile:
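it's basically the classic union-find "path compression" trick - here's a self-contained sketch of the idea (made-up names, not the actual Subs code):
// "Type database": each slot is either a concrete type or a symlink to
// another variable whose type it shares.
#[derive(Clone, Copy)]
enum Content {
    Concrete(u32), // stand-in for a real type structure
    Link(usize),   // "symlink" to another variable
}

struct TypeDb {
    slots: Vec<Content>,
}

impl TypeDb {
    // Follow the symlink chain to its end, compacting it on the way back so
    // every variable on the chain points directly at the final one.
    fn get_root(&mut self, var: usize) -> usize {
        let content = self.slots[var]; // Content is Copy
        match content {
            Content::Link(next) => {
                let root = self.get_root(next);
                self.slots[var] = Content::Link(root); // compaction
                root
            }
            Content::Concrete(_) => var,
        }
    }

    // Read-only flavor: still follows links, but can't compact them.
    fn get_root_without_compacting(&self, mut var: usize) -> usize {
        while let Content::Link(next) = self.slots[var] {
            var = next;
        }
        var
    }
}

fn main() {
    // A -> B -> C, where C holds a concrete type.
    let mut db = TypeDb {
        slots: vec![Content::Link(1), Content::Link(2), Content::Concrete(0)],
    };
    assert_eq!(db.get_root(0), 2);
    // After compaction, A links straight to C: one hop.
    assert!(matches!(db.slots[0], Content::Link(2)));
    assert_eq!(db.get_root_without_compacting(0), 2);
    println!("root of var 0 is var {}", db.get_root(0));
}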
get_content refers to "go get the actual type associated with this Variable" and get_content_without_compacting is the same thing except it takes a &Subs instead of &mut Subs and therefore won't (and can't, because it's not mut) compact any symlink chains it encounters
By "symlinked" you're talking about substitutions?
Yeah I've been trying to learn the theory basics and map that back to what we have in the rust impl. It's a challenge to unpick and see how it all works together. Just poking at the different data structures mostly and translating them mostly 1-1 to zig for now.
Luke Boswell said:
By "symlinked" you're talking about substitutions?
yeah, I always thought of them as symlinks :big_smile:
but that's why it's called Subs - short for "substitutions table"
Where do folks "join" for co-working or pairing or mobbing these days? All Zed? Sometimes Discord? I'm curious to try streaming some workdays, but I haven't felt motivated to set up OBS or perf check my hardware.
Luke and I mainly do Google Meet, but sometimes Zed
Discord is great, but lots of people don't really use it still
I posted earlier when I was starting. I'm almost done for todays session. My plan is to let people know in here if I know I've got a few hours to hack on things.
Oh, you're still going?
Good to know
Ah, hopefully I can join the next time. It's been a while since I last looked into roc’s guts
Hello, I would like to ask if the rust part of roc is still being updated? I want to learn about compiled languages through roc, and I might want to start with the rust part, but I see that roc seems to be getting rewritten in zig?
We have recently started a rewrite in zig. The rust compiler is still there and operational... all the recent effort to add new features is going into the zig compiler.
What are you hoping to learn about? You can definitely read through the rust compiler code, though there is a lot there.
OK, thank you very much for your quick answer. I am very interested in the world of compiled languages and I hope to learn about them from roc. I will try to read roc's rust code and try to make some contributions.
Are we currently accepting contributions for the rust part, or would it be more appropriate to start with zig?
I would point you to https://www.roc-lang.org/community#ideas and say it's a good idea to discuss anything in here before you commit too much time to something you would like to merge.
I would suggest Zig is the best place, though there isn't much actually there to work with at the moment. We're trying to put the bare bones together and get a really simple compiler end to end -- which might take us a little while.
I'll be jumping on either a Zed call or Google Meet in about an hour and a half to work on implementing the basic form of coordinate.zig
if anyone would like to join!
I'm in this call: https://meet.google.com/vse-gvzr-sqv
Is there an existing #roc channel for Zed, or should I create a new one?
I thought we said one already existed, but not sure
We jumped off for now, might jump back on later but not sure
Yeah I'm just grabbing a bite to eat
Heads up, if anyone is ever curious about any of the plans for how a part of the new Zig-based compiler will work, I'm happy to chat. Please feel free to ask questions here or in a direct message :smile:
For anyone interested, myself and a few other roc contributors will be meeting up at . Just an ad hoc update and general discussion. Will be super laid back. I guess we'll try jitsi for this one
Starting now for anyone interested :point_up:
Here are screenshots showing a pathological case of my error message UX pain, where I'm deleting one line of code:
Screenshot_20250322_122439.png
Screenshot_20250322_122512.png
btw I know I've just been gone for the past little bit (got myself into a bit of a journey with Racket), but what's the status of the Zig rewrite so far? (Is the zig src folder readme state accurate, and why are there still issues for the rust version?)
I think we basically have a full tokenizer and parser, but later stages just have shims or limited work. I know progress has been made for canonicalization and unification.
We have done no work to clean up rust issues and we still log some rust bugs. Not that they are planned to be fixed. Just tracked.
Oh, we also have made a decent bit of progress for fuzzing, snapshotting, and testing, to give a more robust base
I think overall progress will ebb and flow in waves
I've been trying to add some snapshotting (sexpr) support to canonicalize, but I've been banging my head against a compiler error for a while. Would appreciate a second pair of eyes if one is available.
My current code is here, and on that line specifically I'm seeing the following error:
src/check/canonicalize/IR.zig:286:42: error: no field or member function named 'toSExpr' in 'collections.safe_list.SafeList(base.ModuleImport).Idx'
var elem_sexpr = elem.toSExpr(env, ir);
~~~~^~~~~~~~
src/collections/safe_list.zig:31:25: note: enum declared here
pub const Idx = enum(u32) { _ };
^~~~~~~~~~~~~~~
Similar errors involving SafeList(base.ModuleImport).Idx also pop up when trying other variations of this loop.
AFAICT this code very explicitly is not doing anything involving ModuleImport's - so I'm very confused.
There are similar errors that pop up in some similar circumstances later in the file, clearly also mismatching in type arguments.
At this point I suspect there is a zig compiler bug of some sort, if nothing else in deciding which types to show to the user when rendering the error message.
Thoughts/tips?
ok I know the idea for "type functions as comptime" in Roc was too complicated, but I just had a thought: extend the Racket "separate compilation guarantee" to any sort of logic beyond pattern matching. That is, a "type function" (generic type) could only act as a simple, pure lambda from one term (that is a type) to another
Probably worth spinning up a #ideas thread with more concrete details.
Joshua Warner said:
I've been trying to add some snapshotting (sexpr) support to canonicalize, but I've been banging my head against a compiler error for a while. Would appreciate a second pair of eyes if one is available.
My current code is here, and on that line specifically I'm seeing the following error:
src/check/canonicalize/IR.zig:286:42: error: no field or member function named 'toSExpr' in 'collections.safe_list.SafeList(base.ModuleImport).Idx'
    var elem_sexpr = elem.toSExpr(env, ir);
    ~~~~^~~~~~~~
src/collections/safe_list.zig:31:25: note: enum declared here
pub const Idx = enum(u32) { _ };
^~~~~~~~~~~~~~~
Similar errors involving SafeList(base.ModuleImport).Idx also pop up when trying other variations of this loop. AFAICT this code very explicitly is not doing anything involving ModuleImports - so I'm very confused.
There are similar errors that pop up in some similar circumstances later in the file, clearly also mismatching in type arguments.
At this point I suspect there is a zig compiler bug of some sort, if nothing else in deciding which types to show to the user when rendering the error message.
Thoughts/tips?
Not immediately, but I probably can take a look tomorrow night or Tuesday.
I think you need to fetch the child expressions and call toSExpr on them @Joshua Warner
Right, I was thinking roughly along those lines, but I haven't happened upon the right pattern to use with SafeMultiList. Applying this diff:
diff --git a/src/check/canonicalize/IR.zig b/src/check/canonicalize/IR.zig
index 3b4858fe9c..25db1cc804 100644
--- a/src/check/canonicalize/IR.zig
+++ b/src/check/canonicalize/IR.zig
@@ -283,7 +283,8 @@ pub const Expr = union(enum) {
appendTypeVarChild(&node, gpa, "elem_var", l.elem_var);
var elems_node = sexpr.Expr.init(gpa, "elems");
for (l.elems.items(.expr)) |elem| {
- var elem_sexpr = elem.toSExpr(env, ir);
+ const e = ir.exprs_at_regions.get(elem);
+ var elem_sexpr = e.toSExpr(env, ir);
elems_node.appendNodeChild(gpa, &elem_sexpr);
}
node.appendNodeChild(gpa, &elems_node);
Results in a very similar error:
src/check/canonicalize/IR.zig:286:55: error: expected type 'collections.safe_list.SafeMultiList(check.canonicalize.IR.When).Idx', found 'collections.safe_list.SafeList(base.ModuleImport).Idx'
const e = ir.exprs_at_regions.get(elem);
^~~~
Ok yeah, found the issue. Zig is indeed giving error messages that involve incorrect types vs what it's actually using internally.
Hey, is there a place that lists open/needed contributions for the zig rewrite, so I don't step on anyone's toes by doing something they're already doing?
Feel free to take anything without an emoji from here. @Agus Zubiaga was really busy with work so perhaps you can try finishing https://github.com/roc-lang/roc/pull/7634 ?
Sure, so long as he doesn't mind I'll copy the PR and start work on it
I don't know how I would get by without git GUIs. I've started work on the PR
are certain values in https://github.com/roc-lang/roc/blob/main/src/base/module_work.zig supposed to not be generic over Work and instead be specific to can.IR?
Should be generic if it’s in base
is it even used?
also there's an error in the code for it (unless I'm just missing something here)
src/base/module_work.zig:92:43: error: expected type 'mem.Allocator', found pointer
.work = Work.init(&can_irs.getWork(work_idx).env),
Zig won't compile or test anything it doesn't have to
It will tree shake to remove code before compiling
So unused code (like that written during the build out process) can easily become stale
ok, yeah zig's compiler is too smart :)
Compiles faster, but means a lot of stuff is totally unchecked
Finally have my Mac Mini on Linux (Fedora 41 w/ Gnome), and it is ready for Roc development!
This is a machine that I'm much more likely to try out Nix on
this was obviously unnecessary, but I thought it was fun: instead of looking up whether Zig's @max and @min are branchless (I assumed so, but wanted to double-check) I asked Claude to prove one way or the other by writing a small zig program, building it, disassembling the output, checking the assembly instructions, and then quoting relevant portions of the asm to me to prove whether they were branchless
And it explains how conditional select differs from a jump?
I mean a cmp then a jmp
Cool use, though still wouldn't truly prove anything. Zig could still be building the branches and llvm could just be removing them (not the case here, but difference between guarantees and happenstance).
Just thinking about the complex heuristics in compilers and what can accidentally be assumed
But really is just a brittle pattern
oh sure, but the thing I care about is whether there's branching in the output asm
so in that sense, it's testing exactly what I want it to test! :smiley:
and then I asked it to do the same thing on a function I wrote, and it confirmed that it was not branchless (as I had suspected) - that is, in this case the ifs were not getting compiled to csels
basically a quicker way to verify that than my usual standby of "copy/paste as much as I need to into godbolt.org and scan the assembly output myself"
That's fair
Just make sure it runs with optimizations
Anthony Bullard said:
Finally have my Mac Mini on Linux (Fedora 41 w/ Gnome), and it is ready for Roc development!
This is a kind of setup that I have not tested before, let me know if you hit any issues! We do test aarch64 linux but macs can be special :big_smile:
Oh to be clear this is a 2018 Intel Mac Mini
And it runs like a dream! The only difficulty I had was figuring out how to get the WiFi card to work - just had to find the proprietary firmware.
I use an HP laptop that runs an Ubuntu 22.04 GNU/Linux OS...
I hope this works for Roc development? :grinning_face_with_smiling_eyes:
That should work, but feel free to ask if you hit any problems
@Magdiel Amor Roc is delightful on Debian 12 on my Framework laptop, and Ubuntu 22.04 is based on Debian 12, so you should have a relatively smooth developer experience!
Sorry for the slight change in plans, but I’m going to need to take a two week hiatus from Roc development work. My workload has gone from busy to insane in the past two weeks and is going to ludicrous through the first week of May.
I will try my best to be at the contributor meeting, but beyond that I’ll need to use my Roc time for work.
no worries…I’m in a similar boat. Biggest launch in Zed’s history is like 3 weeks away (we have a hard deadline of launching before the RustWeek conference) and some of us are already making weekend PRs on it. :sweat_smile:
🤞🏻 please be the Windows version, please be the Windows version 🤞🏻:grinning_face_with_smiling_eyes:
Hey everyone, this week I started streaming myself working. :partying_face: Right now I'm resuming (and will continue this week during North American office hours) script writing for my upcoming Roc syntax tutorial videos, and next week I hope to resume developing my Roc/Rust-based music synthesizer. Join me any time! https://www.twitch.tv/kili_ilo
We did not have a test for Tty.enable_raw_mode!({}) yet in basic-cli, so I made an example that uses it: snake in the terminal :snake:
screenshot_snake.png
Source code here
My First Impressions of Gleam · mtlynch.io.jpeg.png
pretty wild how all of these except the last one are now things 0.1.0 will have...should be a much smoother learning curve for beginners! :smiley:
from https://mtlynch.io/notes/gleam-first-impressions/
I've thought about list accessors btw...would be pretty straightforward, just a[b] desugars to a.get(b)
not sure it's worth having in the language, but the design would be pretty obvious given how a + b will desugar to a.plus(b) etc.
My First Impressions of Gleam · mtlynch.io.jpeg.png
ha! this one too
My First Impressions of Gleam · mtlynch.io.jpeg.png
and this :joy:
Richard Feldman said:
I've thought about list accessors btw...would be pretty straightforward, just a[b] desugars to a.get(b)
Worth serious consideration. A lot of people would probably appreciate it.
The one benefit I see to .get over accessors is that it is more obvious it could return a result. For some reason (probably just used to it), I expect brackets to not return a result, but a value or panic.
I could probably get used to the (arguably better) semantics of brackets also returning a result
Also thinking about tuple accessors, t[0] is probably nicer than t.0
Going back to a convo about re-exporting pub types and when we used to need to put comments on all pub types: TIL that you can do:
pub usingnamespace @import("./types/types.zig");
To re-export all of a sub-modules types!
Luke Boswell said:
Also thinking about tuple accessors, t[0] is probably nicer than t.0
it's not the same bc you can put variables in there - e.g. tuple[x]
a downside of brackets for familiarity is that although x = list[y] can work about the way you'd expect (give or take the Result), list[x] = y wouldn't work at all for multiple reasons
Luke Boswell said:
Also thinking about tuple accessors, t[0] is probably nicer than t.0
Hmmm....I think that has a problem of being a different signature. t.0 would be compile-time known and guaranteed in bounds. t[0] would also allow for t[x] and might be out of bounds. That said, I think we should support that for homogeneous tuples (essentially just treat them as array types).
We could allow list_[x] = y to work though. Just require reassignment.
For Records... I've been pulling together a number of snapshots to exercise different use cases. I think I should introduce those in a separate PR even though they aren't implemented yet.
I'd like someone (maybe @Anthony Bullard or @Richard Feldman) to review and confirm I've got the syntax and described the behaviour correctly before working on "fixing" anything that isn't working in our current implementation.
Hopefully I'll have something to share soon.
sounds good!
i'm currently refactoring a bunch of stuff having to do with numbers in Can
Are you working from main? Richard just merged his PR
yeah @Anthony Bullard make sure to take a look at main - a bunch of canonical number changes just landed!
oh shoot! i had no idea you were working on that
OK, any chance you might be able to look at these snapshot examples https://github.com/roc-lang/roc/pull/7874
yes in a bit
my number changes might be able to be reused in a later stage but i doubt it
i approved but you have a lot of spelling mistakes
fields is uniformly spelled fileds
This is what happens when your fingers don't obey your brain
Should the Ident interner be de-duplicating strings? So if I "insert" two identical strings name and name, do I get the same Idx back?
In Can there are quite a few places where we are comparing "strings", extracting their idents from the interner by idx then comparing bytes. I thought the idea was we intern them (paying the price once) and then we can use their Idxs to compare very cheaply
Update on this for anyone interested.
The problem is that each occurrence of an identifier gets its own Ident.Idx value, even if it has the same text content. The interning system deduplicates the string storage, but still creates separate Ident.Idx values for each occurrence because they have different source locations.
So when we're doing scope lookups, we need to compare by text content (using identsHaveSameText) rather than by exact Ident.Idx equality, because the same variable name used in different places in the code will have different Ident.Idx values.
Note. We should still never need to compare strings
Once you know the offset into the underlying byte storage you know if they match or not
So it should just be a pointer/offset comparison and not a full string comparison
Yeah the helper is called identsHaveSameText and it does that for us.
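roughly this shape (an illustrative Rust sketch, not the actual Zig code):
use std::collections::HashMap;

// Every insert returns a fresh Idx (so each occurrence can carry its own
// metadata, like a source region), but identical text shares one offset into
// the byte storage, so "same text" is just an offset comparison.
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
struct Idx(u32);

#[derive(Default)]
struct Interner {
    bytes: String,
    offset_by_text: HashMap<String, u32>,
    occurrence_offsets: Vec<u32>, // one entry per occurrence
}

impl Interner {
    fn insert(&mut self, text: &str) -> Idx {
        let offset = match self.offset_by_text.get(text).copied() {
            Some(offset) => offset,
            None => {
                let offset = self.bytes.len() as u32;
                self.bytes.push_str(text);
                self.offset_by_text.insert(text.to_string(), offset);
                offset
            }
        };
        self.occurrence_offsets.push(offset);
        Idx((self.occurrence_offsets.len() - 1) as u32)
    }

    // The "identsHaveSameText" idea: compare offsets, never the bytes.
    fn have_same_text(&self, a: Idx, b: Idx) -> bool {
        self.occurrence_offsets[a.0 as usize] == self.occurrence_offsets[b.0 as usize]
    }
}

fn main() {
    let mut interner = Interner::default();
    let a = interner.insert("name");
    let b = interner.insert("name");
    assert_ne!(a, b); // distinct occurrences get distinct Idx values
    assert!(interner.have_same_text(a, b)); // same deduplicated text
    println!("a = {a:?}, b = {b:?}");
}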
i think Martin Odersky has been following you @Richard Feldman
image_3B1C51AF-9251-49A2-AD1C-FE077EFB95C1_1750937783.png