
Commit 84422ff: "Version 4.0 merged"
Parent: c00546e

296 files changed: +30332 / -9664 lines

.github/workflows/ros2.yaml (+4 -1)

@@ -7,7 +7,10 @@ jobs:
   strategy:
     matrix:
       env:
-        - {ROS_DISTRO: eloquent, ROS_REPO: main}
+        - {ROS_DISTRO: foxy, ROS_REPO: main}
+        - {ROS_DISTRO: galactic, ROS_REPO: main}
+        - {ROS_DISTRO: humble, ROS_REPO: main}
+        - {ROS_DISTRO: rolling, ROS_REPO: main}
   runs-on: ubuntu-latest
   steps:
     - uses: actions/checkout@v1

3rdparty/lexy/CMakeLists.txt (new file, +11)

@@ -0,0 +1,11 @@
+# Copyright (C) 2020-2022 Jonathan Müller and lexy contributors
+# SPDX-License-Identifier: BSL-1.0
+
+cmake_minimum_required(VERSION 3.8)
+project(lexy VERSION 2022.05.1 LANGUAGES CXX)
+
+set(LEXY_USER_CONFIG_HEADER "" CACHE FILEPATH "The user config header for lexy.")
+option(LEXY_FORCE_CPP17 "Whether or not lexy should use C++17 even if compiler supports C++20." OFF)
+
+add_subdirectory(src)
+

3rdparty/lexy/LICENSE (new file, +23)

@@ -0,0 +1,23 @@
+Boost Software License - Version 1.0 - August 17th, 2003
+
+Permission is hereby granted, free of charge, to any person or organization
+obtaining a copy of the software and accompanying documentation covered by
+this license (the "Software") to use, reproduce, display, distribute,
+execute, and transmit the Software, and to prepare derivative works of the
+Software, and to permit third-parties to whom the Software is furnished to
+do so, all subject to the following:
+
+The copyright notices in the Software and this entire statement, including
+the above license grant, this restriction and the following disclaimer,
+must be included in all copies of the Software, in whole or in part, and
+all derivative works of the Software, unless such copies or derivative
+works are solely in the form of machine-executable object code generated by
+a source language processor.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE, TITLE AND NON-INFRINGEMENT. IN NO EVENT
+SHALL THE COPYRIGHT HOLDERS OR ANYONE DISTRIBUTING THE SOFTWARE BE LIABLE
+FOR ANY DAMAGES OR OTHER LIABILITY, WHETHER IN CONTRACT, TORT OR OTHERWISE,
+ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
+DEALINGS IN THE SOFTWARE.

3rdparty/lexy/README.adoc (new file, +174)

@@ -0,0 +1,174 @@
+= lexy
+
+ifdef::env-github[]
+image:https://img.shields.io/endpoint?url=https%3A%2F%2Fwww.jonathanmueller.dev%2Fproject%2Flexy%2Findex.json[Project Status,link=https://www.jonathanmueller.dev/project/]
+image:https://github.com/foonathan/lexy/workflows/Main%20CI/badge.svg[Build Status]
+image:https://img.shields.io/badge/try_it_online-blue[Playground,link=https://lexy.foonathan.net/playground]
+endif::[]
+
+lexy is a parser combinator library for {cpp}17 and onwards.
+It allows you to write a parser by specifying it in a convenient {cpp} DSL,
+which gives you all the flexibility and control of a handwritten parser without any of the manual work.
+
+ifdef::env-github[]
+*Documentation*: https://lexy.foonathan.net/[lexy.foonathan.net]
+endif::[]
+
+.IPv4 address parser
+--
+ifndef::env-github[]
+[.godbolt-example]
+.+++<a href="https://godbolt.org/z/scvajjE17", title="Try it online">{{< svg "icons/play.svg" >}}</a>+++
+endif::[]
+[source,cpp]
+----
+namespace dsl = lexy::dsl;
+
+// Parse an IPv4 address into a `std::uint32_t`.
+struct ipv4_address
+{
+    // What is being matched.
+    static constexpr auto rule = [] {
+        // Match a sequence of (decimal) digits and convert it into a std::uint8_t.
+        auto octet = dsl::integer<std::uint8_t>;
+
+        // Match four of them separated by periods.
+        return dsl::times<4>(octet, dsl::sep(dsl::period)) + dsl::eof;
+    }();
+
+    // How the matched output is being stored.
+    static constexpr auto value
+        = lexy::callback<std::uint32_t>([](std::uint8_t a, std::uint8_t b, std::uint8_t c, std::uint8_t d) {
+              return (a << 24) | (b << 16) | (c << 8) | d;
+          });
+};
+----
+--
+
+== Features
+
+Full control::
+* *Describe the parser, not some abstract grammar*:
+Unlike parser generators that use table-driven magic for parsing, lexy's grammar is just syntax sugar for a hand-written recursive descent parser.
+The parsing algorithm does exactly what you've instructed it to do -- no more ambiguities or weird shift/reduce errors!
+* *No implicit backtracking or lookahead*:
+It will only backtrack when you say it should, and only look ahead when and as far as you want it to.
+Don't worry about rules that have side effects; they won't be executed unnecessarily, thanks to the user-specified lookahead conditions.
+https://lexy.foonathan.net/playground?example=peek[Try it online].
+* *Escape hatch for manual parsing*:
+Sometimes you want to parse something that can't be expressed easily with lexy's facilities.
+Don't worry: you can integrate a hand-written parser into the grammar at any point.
+https://lexy.foonathan.net/playground/?example=scan[Try it online].
+* *Tracing*:
+Figure out why the grammar isn't working the way you want it to.
+https://lexy.foonathan.net/playground/?example=trace&mode=trace[Try it online].
+
+Easily integrated::
+* *A pure {cpp} DSL*:
+No need for an external grammar file; embed the grammar directly in your {cpp} project using operator overloading and functions.
+* *Bring your own data structures*:
+You can store results directly into your own types and have full control over all heap allocations.
+* *Fully `constexpr` parsing*:
+Want to parse a string literal at compile time? You can do so.
+* *Minimal standard library dependencies*:
+The core parsing library only depends on fundamental headers such as `<type_traits>` or `<cstddef>`; no big includes like `<vector>` or `<algorithm>`.
+* *Header-only core library* (by necessity, not by choice -- it's `constexpr`, after all).
+
+ifdef::env-github[Designed for text::]
+ifndef::env-github[Designed for text (e.g. {{< github-example json >}}, {{< github-example xml >}}, {{< github-example email >}})::]
+* *Unicode support*: parse UTF-8, UTF-16, or UTF-32, and access the Unicode character database to query char classes or perform case folding.
+https://lexy.foonathan.net/playground?example=identifier-unicode[Try it online].
+* *Convenience*:
+Built-in rules for parsing nested structures, quotes, and escape sequences.
+https://lexy.foonathan.net/playground?example=parenthesized[Try it online].
+* *Automatic whitespace skipping*:
+No need to handle whitespace or comments manually.
+https://lexy.foonathan.net/playground/?example=whitespace_comment[Try it online].
+
+ifdef::env-github[Designed for programming languages::]
+ifndef::env-github[Designed for programming languages (e.g. {{< github-example calculator >}}, {{< github-example shell >}})::]
+* *Keyword and identifier parsing*:
+Reserve a set of keywords that won't be matched as regular identifiers.
+https://lexy.foonathan.net/playground/?example=reserved_identifier[Try it online].
+* *Operator parsing*:
+Parse unary/binary operators with different precedences and associativities, including chained comparisons such as `a < b < c`.
+https://lexy.foonathan.net/playground/?example=expr[Try it online].
+* *Automatic error recovery*:
+Log an error, recover, and continue parsing!
+https://lexy.foonathan.net/playground/?example=recover[Try it online].
+
+ifdef::env-github[Designed for binary input::]
+ifndef::env-github[Designed for binary input (e.g. {{< github-example protobuf >}})::]
+* *Bytes*: Rules for parsing `N` bytes or `N`-bit big/little-endian integers.
+* *Bits*: Rules for parsing individual bit patterns.
+* *Blobs*: Rules for parsing TLV formats.
+
+== FAQ
+
+Why should I use lexy over XYZ?::
+lexy is closest to other PEG parsers.
+However, they usually do more implicit backtracking, which can hurt performance, and you need to be very careful with rules that have side effects.
+This is not the case for lexy, where backtracking is controlled using branch conditions.
+lexy also gives you a lot of control over error reporting, supports error recovery, has special support for operator precedence parsing, and offers other advanced features.
+
+http://boost-spirit.com/home/[Boost.Spirit]:::
+The main difference: lexy is not a Boost library.
+Otherwise, it is just a different implementation with a different flavor.
+Use lexy if you like lexy more.
+https://github.com/taocpp/PEGTL[PEGTL]:::
+PEGTL is very similar and was a big inspiration.
+The biggest difference is that lexy uses an operator-based DSL instead of inheriting from templated classes as PEGTL does;
+depending on your preference, this can be an advantage or a disadvantage.
+Hand-written parsers:::
+Writing a parser by hand is more manual work and more error-prone.
+lexy automates that away without sacrificing control.
+You can use it to quickly prototype a parser and then replace more and more of it with a handwritten parser over time;
+mixing a hand-written parser and a lexy grammar works seamlessly.
+
+How bad are the compilation times?::
+They're not as bad as you might expect (in debug mode, that is).
++
+The example JSON parser compiles in about 2s on my machine.
+If we remove all the lexy-specific parts and just benchmark the time it takes the compiler to process the data structure (and stdlib includes),
+that takes about 700ms.
+If we only validate JSON instead of parsing it, i.e. remove the data structures and keep only the lexy-specific parts, we're looking at about 840ms.
++
+Keep in mind that you can fully isolate lexy in a single translation unit that only needs to be touched when you change the parser.
+You can also split a lexy grammar into multiple translation units using the `dsl::subgrammar` rule.
+
+How bad are the {cpp} error messages if you mess something up?::
+They're certainly worse than the error messages lexy gives you.
+The big problem is that the first line gives you the error, followed by dozens of template instantiations, which end at your `lexy::parse` call.
+Besides providing an external tool to filter those error messages, there is nothing I can do about that.
+
+How fast is it?::
+Benchmarks are available in the `benchmarks/` directory.
+A sample result of the JSON validator benchmark, which compares the example JSON parser with various other implementations, is available https://lexy.foonathan.net/benchmark_json/[here].
+
+Why is it called lexy?::
+I previously had a tokenizer library called foonathan/lex.
+I tried adding a parser to it, but found that the line between pure tokenization and parsing became increasingly blurred.
+lexy is a re-imagination of the parser I added to foonathan/lex, and I've simply kept a similar name.
+
+ifdef::env-github[]
+== Documentation
+
+The documentation, including tutorials, reference documentation, and an interactive playground, can be found at https://lexy.foonathan.net/[lexy.foonathan.net].
+
+A minimal `CMakeLists.txt` that uses lexy can look like this:
+
+.`CMakeLists.txt`
+```cmake
+project(lexy-example)
+
+include(FetchContent)
+FetchContent_Declare(lexy URL https://lexy.foonathan.net/download/lexy-src.zip)
+FetchContent_MakeAvailable(lexy)
+
+add_executable(lexy_example)
+target_sources(lexy_example PRIVATE main.cpp)
+target_link_libraries(lexy_example PRIVATE foonathan::lexy)
+```
+
+endif::[]
New file (+51):

@@ -0,0 +1,51 @@
+// Copyright (C) 2020-2022 Jonathan Müller and lexy contributors
+// SPDX-License-Identifier: BSL-1.0
+
+#ifndef LEXY_DETAIL_ASSERT_HPP_INCLUDED
+#define LEXY_DETAIL_ASSERT_HPP_INCLUDED
+
+#include <lexy/_detail/config.hpp>
+
+#ifndef LEXY_ENABLE_ASSERT
+
+// By default, enable assertions if NDEBUG is not defined.
+
+#    if NDEBUG
+#        define LEXY_ENABLE_ASSERT 0
+#    else
+#        define LEXY_ENABLE_ASSERT 1
+#    endif
+
+#endif
+
+#if LEXY_ENABLE_ASSERT
+
+// We want assertions: use assert() if that's available, otherwise abort.
+// We don't use assert() directly as that's not constexpr.
+
+#    if NDEBUG
+
+#        include <cstdlib>
+#        define LEXY_PRECONDITION(Expr) ((Expr) ? void(0) : std::abort())
+#        define LEXY_ASSERT(Expr, Msg) ((Expr) ? void(0) : std::abort())
+
+#    else
+
+#        include <cassert>
+
+#        define LEXY_PRECONDITION(Expr) ((Expr) ? void(0) : assert(Expr))
+#        define LEXY_ASSERT(Expr, Msg) ((Expr) ? void(0) : assert((Expr) && (Msg)))
+
+#    endif
+
+#else
+
+// We don't want assertions.
+
+#    define LEXY_PRECONDITION(Expr) static_cast<void>(sizeof(Expr))
+#    define LEXY_ASSERT(Expr, Msg) static_cast<void>(sizeof(Expr))
+
+#endif
+
+#endif // LEXY_DETAIL_ASSERT_HPP_INCLUDED
