1. 14 Nov, 2020 12 commits
    • Token.Normalize utility method. · a8995f6f
      Eric Myhre authored
      Useful for tests that do deep equality tests on structures.
      
      Same caveat about current placement of this method as in the previous
      commit: this might be worth detaching and shifting to a 'codectest'
      or 'tokentest' package.  But let's see how it shakes out.
    • Extract and export StringifyTokenSequence utility. · d3511334
      Eric Myhre authored
      This is far too useful in testing to reproduce in each package that
      needs something like it.  It's already shown up as desirable again
      as soon as I start implementing even a little bit of even one codec
      tokenizer, and that's gonna keep happening.
      
      This might be worth moving to some kind of a 'tokentest' or
      'codectest' package instead of cluttering up this one, but...
      we'll see; I've got a fair amount more code to flush into commits,
      and after that we can reshake things and see if packages settle
      out differently.
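      The value of such a helper is that a mismatched token stream shows up as a readable line-by-line diff in test output.  A minimal sketch of the idea (all type and function names here are illustrative stand-ins, not go-ipld-prime's actual API):

```go
package main

import (
	"fmt"
	"strings"
)

// TokenKind and Token are simplified stand-ins for the real types;
// the actual library's token carries more kinds and fields.
type TokenKind string

const (
	TokenKind_MapOpen  TokenKind = "{"
	TokenKind_MapClose TokenKind = "}"
	TokenKind_String   TokenKind = "s"
	TokenKind_Int      TokenKind = "i"
)

type Token struct {
	Kind TokenKind
	Str  string
	Int  int64
}

// stringifyTokenSequence renders a token slice one token per line,
// so test failures diff cleanly.
func stringifyTokenSequence(toks []Token) string {
	var sb strings.Builder
	for _, tok := range toks {
		switch tok.Kind {
		case TokenKind_MapOpen:
			sb.WriteString("<map>\n")
		case TokenKind_MapClose:
			sb.WriteString("</map>\n")
		case TokenKind_String:
			fmt.Fprintf(&sb, "string:%q\n", tok.Str)
		case TokenKind_Int:
			fmt.Fprintf(&sb, "int:%d\n", tok.Int)
		}
	}
	return sb.String()
}

func main() {
	toks := []Token{
		{Kind: TokenKind_MapOpen},
		{Kind: TokenKind_String, Str: "a"},
		{Kind: TokenKind_Int, Int: 1},
		{Kind: TokenKind_MapClose},
	}
	fmt.Print(stringifyTokenSequence(toks))
}
```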
    • Add budget parameter to TokenReader. · 33fb7d98
      Eric Myhre authored
      There were already comments about how this would be "probably"
      necessary; I don't know why I wavered, it certainly is.
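      The budget idea can be sketched as follows (all names here are hypothetical; the real TokenReader signature differs): each read charges its cost against a caller-supplied budget and fails once the budget is exhausted, so hostile or malformed input can't force unbounded allocation.

```go
package main

import (
	"errors"
	"fmt"
)

// ErrBudgetExceeded is an illustrative error value; the real library
// defines its own.
var ErrBudgetExceeded = errors.New("token read exceeds allotted resource budget")

// readStringToken pretends to read one string token of len(payload)
// bytes, charging the cost against *budget before accepting it.
func readStringToken(payload string, budget *int64) (string, error) {
	cost := int64(len(payload))
	if *budget-cost < 0 {
		return "", ErrBudgetExceeded
	}
	*budget -= cost
	return payload, nil
}

func main() {
	budget := int64(10)
	if s, err := readStringToken("hello", &budget); err == nil {
		fmt.Printf("read %q, budget left: %d\n", s, budget)
	}
	if _, err := readStringToken("this is far too long", &budget); err != nil {
		fmt.Println("error:", err)
	}
}
```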
    • Type the TokenKind consts correctly. · 72793f26
      Eric Myhre authored
      You can write a surprising amount of code where the compiler will shrug
      and silently coerce things for you.  Right up until you can't.
      (Some test cases that'll be coming down the commit queue shortly
      happened to end up checking the type of the constants, and, well.
      Suddenly this was noticeable.)
    • Drop earlier design comments. · 2143068c
      Eric Myhre authored
      We definitely did make a TokenWalker, heh.
      
      The other naming marsh (heh, see what I did there?) is still unresolved
      but can stay unresolved a while longer.
    • Fresh take on codec APIs, and some tokenization utilities. · 1da7e2dd
      Eric Myhre authored
      The tokenization system may look familiar to refmt's tokens -- and
      indeed it surely is inspired by and in the same pattern -- but it
      hews a fair bit closer to the IPLD Data Model definitions of kinds,
      and it also includes links as a token kind.  Presence of link as
      a token kind means if we build codecs around these, the handling
      of links will be better and most consistently abstracted (the
      current dagjson and dagcbor implementations are instructive for what
      an odd mess it is when you have most of the tokenization happen
      before you get to the level that figures out links; I think we can
      improve on that code greatly by moving the barriers around a bit).
      
      I made both all-at-once and pumpable versions of both the token
      producers and the token consumers.  Each are useful in different
      scenarios.  The pumpable versions are probably generally a bit slower,
      but they're also more composable.  (The all-at-once versions can't
      be glued to each other; only to pumpable versions.)
      
      Some new and much reduced contracts for codecs are added,
      but not yet implemented by anything in this commit.
      The comments on them are lengthy and detail the ways I'm thinking
      that codecs should be (re)implemented in the future to maximize
      usability and performance and also allow some configurability.
      (The current interfaces "work", but irritate me a great deal every
      time I use them; to be honest, I just plain guessed badly at what
      the API here should be the first time I did it.  Configurability
      should be both easy to *not* engage in, but also easier if you do
      (and in particular, not require reaching into *another* library's
      packages to do it!).)  More work will be required to bring this
      to fruition.
      
      It may be particularly interesting to notice that the tokenization
      systems also allow complex keys -- maps and lists can show up as the
      keys to maps!  This is something not allowed by the data model (and
      for dare I say obvious reasons)... but it's something that's possible
      at the schema layer (e.g. structs with representation strategies that
      make them representable as strings can be used as map keys), so,
      these functions support it.
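      The composability point about pumpable versus all-at-once forms can be roughly illustrated like this (all types here are invented stand-ins, not the library's API): a pumpable source hands out one token per call, so anything can drive it; an all-at-once consumer has no seam where another all-at-once producer could attach.

```go
package main

import "fmt"

// Token is a stand-in for the library's token type (illustrative only).
type Token struct{ Kind string }

// pumpableSource yields one token per call: resumable, and therefore
// gluable to any consumer that can call Step in a loop.
type pumpableSource struct {
	toks []Token
	i    int
}

func (s *pumpableSource) Step() (Token, bool) {
	if s.i >= len(s.toks) {
		return Token{}, false
	}
	t := s.toks[s.i]
	s.i++
	return t, true
}

// consumeAll is an "all-at-once" consumer: it drives a pumpable source
// to completion in a single call.  Two all-at-once halves could not be
// glued together this way, because neither yields control to the other.
func consumeAll(src *pumpableSource) int {
	n := 0
	for {
		if _, ok := src.Step(); !ok {
			return n
		}
		n++
	}
}

func main() {
	src := &pumpableSource{toks: []Token{{"map"}, {"string"}, {"end"}}}
	fmt.Println("consumed", consumeAll(src), "tokens")
}
```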
    • Merge pull request #98 from ipld/adl-demo-rot13 · 35ad3e37
      Eric Myhre authored
      Add a demo ADL (rot13adl)
    • rot13adl demo: example of creating data using synthetic view, then accessing substrate. · 65413830
      Eric Myhre authored
      Exported a symbol that's needed for this to be possible from
      outside the package.  This still probably deserves an interface,
      too, though.  Comments on that also updated, but we'll still leave
      that for future work (more examples of more ADLs wanted before we try
      to solidify on something there).
    • rot13adl demo: finish documentation; simplify Reify; more recommendations about how to implement Reify; consistent export symbol conventions; some fixes. · 9d537f10
      Eric Myhre authored
    • 11223290
    • Merge pull request #110 from ipld/traversal-select-links · 04afddfb
      Eric Myhre authored
      Introduce traversal function that selects links out of a tree.
  2. 13 Nov, 2020 1 commit
  3. 02 Nov, 2020 1 commit
  4. 30 Oct, 2020 7 commits
  5. 21 Oct, 2020 5 commits
  6. 20 Oct, 2020 6 commits
    • codegen: make error info available when tuples process data that is too long. · 2461bba8
      Eric Myhre authored
      This requires introducing an error-carrying NodeAssembler,
      because the AssembleValue methods don't have the ability to return errors themselves.
      
      AssembleValue methods have not needed to return errors before!
      Most lists don't have any reason to error: out of our whole system,
      it's only struct-with-tuple-representation that can have errors here,
      due to tuples having length limits.
      AssembleValue for maps doesn't have a similar challenge either,
      because key invalidity can always be indicated by errors returned from
      the key assembly process.
      
      I'm not a big fan of this diff -- error carrying thunks like this are
      ugly to write and they're also pretty ugly to use -- but I'm not sure
      what would be better.  ListAssembler.AssembleValue returning an error?
      Turning ListAssembler into a two phase thing, e.g. with an Advance
      method that fills some of the same role as AssembleKey does for maps,
      and gives us a place to return errors?
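      The error-carrying thunk pattern can be sketched like this (the interface and all names are simplified stand-ins for go-ipld-prime's actual NodeAssembler types): when AssembleValue cannot return an error itself, it hands back a stub whose every method just returns the stored error.

```go
package main

import "fmt"

// ValueAssembler is a trimmed stand-in showing only one method.
type ValueAssembler interface {
	AssignString(string) error
}

// errorAssembler carries an error: every method returns it.  This is
// the thunk used when the method handing out the assembler has no
// error return of its own.
type errorAssembler struct{ err error }

func (ea errorAssembler) AssignString(string) error { return ea.err }

// okAssembler accepts values normally.
type okAssembler struct{}

func (okAssembler) AssignString(string) error { return nil }

// tupleAssembler enforces a fixed length; once exceeded, AssembleValue
// returns an errorAssembler, deferring the error to the next call site
// that does have an error return.
type tupleAssembler struct {
	max, n int
}

func (ta *tupleAssembler) AssembleValue() ValueAssembler {
	if ta.n >= ta.max {
		return errorAssembler{fmt.Errorf("tuple of length %d is full", ta.max)}
	}
	ta.n++
	return okAssembler{}
}

func main() {
	ta := &tupleAssembler{max: 2}
	for _, s := range []string{"a", "b", "c"} {
		if err := ta.AssembleValue().AssignString(s); err != nil {
			fmt.Println("error:", err)
		}
	}
}
```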
    • Merge-ignore branch 'stalled/dagjson-bytes'. · e2551099
      Eric Myhre authored
      I don't think we've resolved the questions already noted in that diff yet,
      and I abhor drifting open branches, so I'm going to "merge" this
      such that it's in git history for reference, but without its diff
      taking effect.
      
      Additionally, I believe we encountered a question about whether this string
      should include multibase -- this diff doesn't; the spec suggested it should;
      and I think we've agreed the spec should be changed, but I'm not sure if
      that's been done yet.
      
      At any rate, someone's welcome to take a look at this again in the future.
    • Attempt to support dagjson spec for bytes; writer side. · d5b658c1
      Eric Myhre authored
      I haven't implemented the reader side because I'm not sure it's possible;
      the specification is insufficiently clear.
      I opened Issue https://github.com/ipld/specs/issues/302 to track this.
    • Merge branch 'codegen-typofixes' · e53a5474
      Eric Myhre authored
    • 329252db
    • Merge pull request #85 from ipld/resource-budget-for-dagcbor-parser · 6428f6bb
      Eric Myhre authored
      Implement resource budgets in dagcbor parsing.
  7. 15 Oct, 2020 1 commit
  8. 04 Oct, 2020 1 commit
  9. 01 Oct, 2020 3 commits
  10. 28 Sep, 2020 1 commit
  11. 24 Sep, 2020 1 commit
  12. 10 Sep, 2020 1 commit
    • schema/gen/go: make all top-level tests parallel · 35003242
      Daniel Martí authored
      They do quite a lot of work, including setting up a type system,
      generating Go code, and building it.
      
      This package is by far the slowest to test. On a warm cache, 'go test
      -count=1 ./...' shows all packages under 0.2s, while this one sits at
      ~1.5s.
      
      1.5s is not unreasonable with a warm cache, but we can bring that down
      to 0.6s on my quad-core laptop by just making all tests parallel. Note
      that sub-tests aren't parallel for now, since there are multiple layers
      of them and I'm not sure I even follow what's going on.
      
      Mac is also disabled for the time being, since it seems to time out too
      often.