public inbox for ~johnnyrichard/olang-devel@lists.sr.ht
 help / color / mirror / code / Atom feed
* [olang/patches/.build.yml] build failed
  2024-02-19 21:04 ` [PATCH olang v4 4/4] lexer: test: add integration tests for --dump-tokens Johnny Richard
@ 2024-02-19 20:07   ` builds.sr.ht
  0 siblings, 0 replies; 6+ messages in thread
From: builds.sr.ht @ 2024-02-19 20:07 UTC (permalink / raw)
  To: Johnny Richard; +Cc: ~johnnyrichard/olang-devel

olang/patches/.build.yml: FAILED in 42s

[Create --dump-tokens on compiler cli][0] v4 from [Johnny Richard][1]

[0]: https://lists.sr.ht/~johnnyrichard/olang-devel/patches/49682
[1]: mailto:johnny@johnnyrichard.com

✗ #1153624 FAILED olang/patches/.build.yml https://builds.sr.ht/~johnnyrichard/job/1153624

^ permalink raw reply	[flat|nested] 6+ messages in thread

* [PATCH olang v4 0/4] Create --dump-tokens on compiler cli
@ 2024-02-19 21:04 Johnny Richard
  2024-02-19 21:04 ` [PATCH olang v4 1/4] utils: create string_view data structure Johnny Richard
                   ` (3 more replies)
  0 siblings, 4 replies; 6+ messages in thread
From: Johnny Richard @ 2024-02-19 21:04 UTC (permalink / raw)
  To: ~johnnyrichard/olang-devel; +Cc: Johnny Richard

This patchset creates the lexer and adds a --dump-tokens option to the
0c compiler.
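
For illustration, the intended invocation and the first few tokens it
prints for the bundled example look like this (the full expected output
is asserted by the integration test in patch 4/4; this assumes the
compiler binary is built as ./0c at the repository root):

    $ ./0c examples/main_exit.0 --dump-tokens
    examples/main_exit.0:1:1: <fn>
    examples/main_exit.0:1:4: <identifier>
    examples/main_exit.0:1:8: <(>
    ...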

Carlos Maniero (1):
  lexer: test: add integration tests for --dump-tokens

Johnny Richard (3):
  utils: create string_view data structure
  lexer: create --dump-tokens cli command
  docs: create man page for 0c compiler

 .gitignore                     |   1 +
 docs/Makefile                  |  21 ++-
 docs/manpages/0c.md            |  21 +++
 examples/main_exit.0           |   3 +
 src/0c.c                       | 129 +++++++++++++++++-
 src/lexer.c                    | 235 +++++++++++++++++++++++++++++++++
 src/lexer.h                    |  74 +++++++++++
 src/string_view.c              |  35 +++++
 src/string_view.h              |  34 +++++
 tests/integration/cli_runner.c |  47 ++++++-
 tests/integration/cli_runner.h |   3 +-
 tests/integration/cli_test.c   |  16 ++-
 12 files changed, 604 insertions(+), 15 deletions(-)
 create mode 100644 docs/manpages/0c.md
 create mode 100644 examples/main_exit.0
 create mode 100644 src/lexer.c
 create mode 100644 src/lexer.h
 create mode 100644 src/string_view.c
 create mode 100644 src/string_view.h

-- 
2.43.2


^ permalink raw reply	[flat|nested] 6+ messages in thread

* [PATCH olang v4 1/4] utils: create string_view data structure
  2024-02-19 21:04 [PATCH olang v4 0/4] Create --dump-tokens on compiler cli Johnny Richard
@ 2024-02-19 21:04 ` Johnny Richard
  2024-02-19 21:04 ` [PATCH olang v4 2/4] lexer: create --dump-tokens cli command Johnny Richard
                   ` (2 subsequent siblings)
  3 siblings, 0 replies; 6+ messages in thread
From: Johnny Richard @ 2024-02-19 21:04 UTC (permalink / raw)
  To: ~johnnyrichard/olang-devel; +Cc: Johnny Richard

Signed-off-by: Johnny Richard <johnny@johnnyrichard.com>
---
 src/string_view.c | 35 +++++++++++++++++++++++++++++++++++
 src/string_view.h | 34 ++++++++++++++++++++++++++++++++++
 2 files changed, 69 insertions(+)
 create mode 100644 src/string_view.c
 create mode 100644 src/string_view.h

diff --git a/src/string_view.c b/src/string_view.c
new file mode 100644
index 0000000..122eaa2
--- /dev/null
+++ b/src/string_view.c
@@ -0,0 +1,35 @@
+/*
+ * Copyright (C) 2024 olang maintainers
+ *
+ * This program is free software: you can redistribute it and/or modify
+ * it under the terms of the GNU General Public License as published by
+ * the Free Software Foundation, either version 3 of the License, or
+ * (at your option) any later version.
+ *
+ * This program is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+ * GNU General Public License for more details.
+ *
+ * You should have received a copy of the GNU General Public License
+ * along with this program.  If not, see <https://www.gnu.org/licenses/>.
+ */
+#include "string_view.h"
+
+#include <stdbool.h>
+#include <string.h>
+
+bool
+string_view_eq_to_cstr(string_view_t str, char *cstr)
+{
+    size_t cstr_len = strlen(cstr);
+    if (str.size != cstr_len) {
+        return false;
+    }
+
+    size_t i = 0;
+    while (i < cstr_len && str.chars[i] == cstr[i]) {
+        i++;
+    }
+    return i == cstr_len;
+}
diff --git a/src/string_view.h b/src/string_view.h
new file mode 100644
index 0000000..367ef6b
--- /dev/null
+++ b/src/string_view.h
@@ -0,0 +1,34 @@
+/*
+ * Copyright (C) 2024 olang maintainers
+ *
+ * This program is free software: you can redistribute it and/or modify
+ * it under the terms of the GNU General Public License as published by
+ * the Free Software Foundation, either version 3 of the License, or
+ * (at your option) any later version.
+ *
+ * This program is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+ * GNU General Public License for more details.
+ *
+ * You should have received a copy of the GNU General Public License
+ * along with this program.  If not, see <https://www.gnu.org/licenses/>.
+ */
+#ifndef STRING_VIEW_T
+#define STRING_VIEW_T
+
+#include <stdbool.h>
+#include <stddef.h>
+
+typedef struct string_view
+{
+    char *chars;
+    size_t size;
+
+} string_view_t;
+
+// TODO: missing unit test
+bool
+string_view_eq_to_cstr(string_view_t str, char *cstr);
+
+#endif /* STRING_VIEW_T */
-- 
2.43.2


^ permalink raw reply	[flat|nested] 6+ messages in thread

* [PATCH olang v4 2/4] lexer: create --dump-tokens cli command
  2024-02-19 21:04 [PATCH olang v4 0/4] Create --dump-tokens on compiler cli Johnny Richard
  2024-02-19 21:04 ` [PATCH olang v4 1/4] utils: create string_view data structure Johnny Richard
@ 2024-02-19 21:04 ` Johnny Richard
  2024-02-19 21:04 ` [PATCH olang v4 3/4] docs: create man page for 0c compiler Johnny Richard
  2024-02-19 21:04 ` [PATCH olang v4 4/4] lexer: test: add integration tests for --dump-tokens Johnny Richard
  3 siblings, 0 replies; 6+ messages in thread
From: Johnny Richard @ 2024-02-19 21:04 UTC (permalink / raw)
  To: ~johnnyrichard/olang-devel; +Cc: Johnny Richard

This patch introduces the --dump-tokens interface and creates the
initial setup for lexical analysis.
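
As a minimal sketch of how the new lexer API is meant to be driven
(mirroring what main() in src/0c.c does below; count_tokens is a
hypothetical helper, not part of this patch):

    #include "lexer.h"
    #include "string_view.h"

    /* Count the tokens produced for a source buffer (EOF excluded). */
    static size_t
    count_tokens(string_view_t source)
    {
        lexer_t lexer = { 0 };
        lexer_init(&lexer, source);

        token_t token = { 0 };
        size_t count = 0;

        lexer_next_token(&lexer, &token);
        while (token.kind != TOKEN_EOF) {
            count++;
            lexer_next_token(&lexer, &token);
        }

        return count;
    }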

Signed-off-by: Johnny Richard <johnny@johnnyrichard.com>
---
 .gitignore                     |   1 +
 examples/main_exit.0           |   3 +
 src/0c.c                       | 131 +++++++++++++++++-
 src/lexer.c                    | 235 +++++++++++++++++++++++++++++++++
 src/lexer.h                    |  74 +++++++++++
 tests/integration/cli_runner.c |   4 +-
 tests/integration/cli_runner.h |   2 +-
 tests/integration/cli_test.c   |   2 +-
 8 files changed, 446 insertions(+), 6 deletions(-)
 create mode 100644 examples/main_exit.0
 create mode 100644 src/lexer.c
 create mode 100644 src/lexer.h

diff --git a/.gitignore b/.gitignore
index fe64668..92496d7 100644
--- a/.gitignore
+++ b/.gitignore
@@ -2,3 +2,4 @@
 build
 *.o
 docs/site.tar.gz
+tests/integration/*_test
diff --git a/examples/main_exit.0 b/examples/main_exit.0
new file mode 100644
index 0000000..c86fc68
--- /dev/null
+++ b/examples/main_exit.0
@@ -0,0 +1,3 @@
+fn main(): u32 {
+  return 0
+}
diff --git a/src/0c.c b/src/0c.c
index 33ac945..0af9caa 100644
--- a/src/0c.c
+++ b/src/0c.c
@@ -14,8 +14,135 @@
  * You should have received a copy of the GNU General Public License
  * along with this program.  If not, see <https://www.gnu.org/licenses/>.
  */
+#include <errno.h>
+#include <stdbool.h>
+#include <stdio.h>
+#include <stdlib.h>
+#include <string.h>
+
+#include "lexer.h"
+#include "string_view.h"
+
+typedef struct cli_args
+{
+    int argc;
+    char **argv;
+} cli_args_t;
+
+char *
+cli_args_shift(cli_args_t *args);
+
+typedef struct cli_opts
+{
+    // TODO: create man page instruction for --dump-tokens option
+    bool dump_tokens;
+    char *file_path;
+} cli_opts_t;
+
+void
+print_usage(FILE *stream, char *prog);
+
+static void
+print_token(char *file_path, token_t *token);
+
+string_view_t
+read_entire_file(char *file_path);
+
 int
-main(void)
+main(int argc, char **argv)
+{
+    cli_args_t args = { .argc = argc, .argv = argv };
+    cli_opts_t opts = { 0 };
+
+    char *prog = cli_args_shift(&args);
+
+    if (argc != 3) {
+        print_usage(stderr, prog);
+        return EXIT_FAILURE;
+    }
+
+    for (char *arg = cli_args_shift(&args); arg != NULL; arg = cli_args_shift(&args)) {
+        if (strcmp(arg, "--dump-tokens") == 0) {
+            opts.dump_tokens = true;
+        } else {
+            opts.file_path = arg;
+        }
+    }
+
+    if (!opts.dump_tokens) {
+        print_usage(stderr, prog);
+        return EXIT_FAILURE;
+    }
+
+    string_view_t file_content = read_entire_file(opts.file_path);
+
+    // TODO: missing integration test for lexer tokenizing
+    lexer_t lexer = { 0 };
+    lexer_init(&lexer, file_content);
+
+    token_t token = { 0 };
+    lexer_next_token(&lexer, &token);
+    while (token.kind != TOKEN_EOF) {
+        print_token(opts.file_path, &token);
+        lexer_next_token(&lexer, &token);
+    }
+    print_token(opts.file_path, &token);
+
+    free(file_content.chars);
+
+    return EXIT_SUCCESS;
+}
+
+char *
+cli_args_shift(cli_args_t *args)
+{
+    if (args->argc == 0)
+        return NULL;
+    --(args->argc);
+    return *(args->argv)++;
+}
+
+void
+print_usage(FILE *stream, char *prog)
+{
+    fprintf(stream, "usage: %s <source.0> --dump-tokens\n", prog);
+}
+
+string_view_t
+read_entire_file(char *file_path)
+{
+    FILE *stream = fopen(file_path, "rb");
+
+    if (stream == NULL) {
+        fprintf(stderr, "Could not open file %s: %s\n", file_path, strerror(errno));
+        exit(EXIT_FAILURE);
+    }
+
+    string_view_t file_content = { 0 };
+
+    fseek(stream, 0, SEEK_END);
+    file_content.size = ftell(stream);
+    fseek(stream, 0, SEEK_SET);
+
+    file_content.chars = (char *)malloc(file_content.size);
+
+    if (file_content.chars == NULL) {
+        fprintf(stderr, "Could not read file %s: %s\n", file_path, strerror(errno));
+        exit(EXIT_FAILURE);
+    }
+
+    fread(file_content.chars, 1, file_content.size, stream);
+    fclose(stream);
+
+    return file_content;
+}
+
+static void
+print_token(char *file_path, token_t *token)
 {
-    return 0;
+    printf("%s:%lu:%lu: <%s>\n",
+           file_path,
+           token->location.row + 1,
+           (token->location.offset - token->location.bol) + 1,
+           token_kind_to_cstr(token->kind));
 }
diff --git a/src/lexer.c b/src/lexer.c
new file mode 100644
index 0000000..b107762
--- /dev/null
+++ b/src/lexer.c
@@ -0,0 +1,235 @@
+/*
+ * Copyright (C) 2024 olang maintainers
+ *
+ * This program is free software: you can redistribute it and/or modify
+ * it under the terms of the GNU General Public License as published by
+ * the Free Software Foundation, either version 3 of the License, or
+ * (at your option) any later version.
+ *
+ * This program is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+ * GNU General Public License for more details.
+ *
+ * You should have received a copy of the GNU General Public License
+ * along with this program.  If not, see <https://www.gnu.org/licenses/>.
+ */
+#include "lexer.h"
+
+#include <assert.h>
+#include <ctype.h>
+#include <stdbool.h>
+
+void
+lexer_init(lexer_t *lexer, string_view_t source)
+{
+    assert(lexer);
+    lexer->source = source;
+    lexer->offset = 0;
+    lexer->row = 0;
+    lexer->bol = 0;
+}
+
+static char
+lexer_current_char(lexer_t *lexer);
+
+static void
+lexer_skip_char(lexer_t *lexer);
+
+static bool
+lexer_is_eof(lexer_t *lexer);
+
+static bool
+lexer_is_not_eof(lexer_t *lexer);
+
+static bool
+_isspace(char c);
+
+static void
+lexer_init_char_value_token(lexer_t *lexer, token_t *token, token_kind_t kind);
+
+static void
+lexer_init_str_value_token(lexer_t *lexer, token_t *token, token_kind_t kind, size_t start_offset);
+
+static void
+lexer_init_eof_token(lexer_t *lexer, token_t *token);
+
+static token_kind_t
+lexer_str_to_token_kind(string_view_t text);
+
+void
+lexer_next_token(lexer_t *lexer, token_t *token)
+{
+    if (lexer_is_eof(lexer)) {
+        lexer_init_eof_token(lexer, token);
+        return;
+    }
+
+    char current_char = lexer_current_char(lexer);
+
+    if (_isspace(current_char)) {
+        while (_isspace(current_char) && lexer_is_not_eof(lexer)) {
+            lexer_skip_char(lexer);
+            current_char = lexer_current_char(lexer);
+        }
+    }
+
+    while (lexer_is_not_eof(lexer)) {
+        if (isalpha(current_char)) {
+            size_t start_offset = lexer->offset;
+            while (isalnum(current_char) && lexer_is_not_eof(lexer)) {
+                lexer_skip_char(lexer);
+                current_char = lexer_current_char(lexer);
+            }
+
+            string_view_t text = { .chars = lexer->source.chars + start_offset, .size = lexer->offset - start_offset };
+
+            lexer_init_str_value_token(lexer, token, lexer_str_to_token_kind(text), start_offset);
+            return;
+        }
+
+        if (isdigit(current_char)) {
+            size_t start_offset = lexer->offset;
+            while (isdigit(current_char) && lexer_is_not_eof(lexer)) {
+                lexer_skip_char(lexer);
+                current_char = lexer_current_char(lexer);
+            }
+
+            lexer_init_str_value_token(lexer, token, TOKEN_NUMBER, start_offset);
+            return;
+        }
+
+        switch (current_char) {
+            case '(': {
+                lexer_init_char_value_token(lexer, token, TOKEN_OPAREN);
+                lexer_skip_char(lexer);
+                return;
+            }
+            case ')': {
+                lexer_init_char_value_token(lexer, token, TOKEN_CPAREN);
+                lexer_skip_char(lexer);
+                return;
+            }
+            case ':': {
+                lexer_init_char_value_token(lexer, token, TOKEN_COLON);
+                lexer_skip_char(lexer);
+                return;
+            }
+            case '{': {
+                lexer_init_char_value_token(lexer, token, TOKEN_OCURLY);
+                lexer_skip_char(lexer);
+                return;
+            }
+            case '}': {
+                lexer_init_char_value_token(lexer, token, TOKEN_CCURLY);
+                lexer_skip_char(lexer);
+                return;
+            }
+            case '\n': {
+                lexer_init_char_value_token(lexer, token, TOKEN_LF);
+                lexer_skip_char(lexer);
+                return;
+            }
+            default: {
+                lexer_init_char_value_token(lexer, token, TOKEN_UNKNOWN);
+                lexer_skip_char(lexer);
+                return;
+            }
+        }
+    }
+
+    if (lexer_is_eof(lexer)) {
+        lexer_init_eof_token(lexer, token);
+        return;
+    }
+}
+
+static char *token_kind_str_table[] = {
+    [TOKEN_UNKNOWN] = "unknown", [TOKEN_IDENTIFIER] = "identifier",
+    [TOKEN_NUMBER] = "number",   [TOKEN_FN] = "fn",
+    [TOKEN_RETURN] = "return",   [TOKEN_LF] = "line_feed",
+    [TOKEN_OPAREN] = "(",        [TOKEN_CPAREN] = ")",
+    [TOKEN_COLON] = ":",         [TOKEN_OCURLY] = "{",
+    [TOKEN_CCURLY] = "}",        [TOKEN_EOF] = "EOF",
+};
+
+char *
+token_kind_to_cstr(token_kind_t kind)
+{
+    assert(kind < sizeof(token_kind_str_table) / sizeof(token_kind_str_table[0]));
+    return token_kind_str_table[kind];
+}
+
+static char
+lexer_current_char(lexer_t *lexer)
+{
+    return lexer->source.chars[lexer->offset];
+}
+
+static void
+lexer_skip_char(lexer_t *lexer)
+{
+    assert(lexer->offset < lexer->source.size);
+    if (lexer_current_char(lexer) == '\n') {
+        lexer->row++;
+        lexer->bol = ++lexer->offset;
+    } else {
+        lexer->offset++;
+    }
+}
+
+static bool
+lexer_is_eof(lexer_t *lexer)
+{
+    return lexer->offset >= lexer->source.size;
+}
+
+static bool
+lexer_is_not_eof(lexer_t *lexer)
+{
+    return !lexer_is_eof(lexer);
+}
+
+static bool
+_isspace(char c)
+{
+    return c != '\n' && isspace(c);
+}
+
+static void
+lexer_init_char_value_token(lexer_t *lexer, token_t *token, token_kind_t kind)
+{
+    string_view_t str = { .chars = lexer->source.chars + lexer->offset, .size = 1 };
+    token_loc_t location = { .offset = lexer->offset, .row = lexer->row, .bol = lexer->bol };
+    *token = (token_t){ .kind = kind, .value = str, .location = location };
+}
+
+static void
+lexer_init_str_value_token(lexer_t *lexer, token_t *token, token_kind_t kind, size_t start_offset)
+{
+    string_view_t str = { .chars = lexer->source.chars + start_offset, .size = lexer->offset - start_offset };
+    token_loc_t location = { .offset = start_offset, .row = lexer->row, .bol = lexer->bol };
+    *token = (token_t){ .kind = kind, .value = str, .location = location };
+}
+
+static void
+lexer_init_eof_token(lexer_t *lexer, token_t *token)
+{
+    string_view_t str = { 0 };
+    token_loc_t location = { .offset = lexer->offset, .row = lexer->row, .bol = lexer->bol };
+    *token = (token_t){ .kind = TOKEN_EOF, .value = str, .location = location };
+}
+
+static token_kind_t
+lexer_str_to_token_kind(string_view_t text)
+{
+    if (string_view_eq_to_cstr(text, "return")) {
+        return TOKEN_RETURN;
+    }
+
+    if (string_view_eq_to_cstr(text, "fn")) {
+        return TOKEN_FN;
+    }
+
+    return TOKEN_IDENTIFIER;
+}
diff --git a/src/lexer.h b/src/lexer.h
new file mode 100644
index 0000000..8c09e02
--- /dev/null
+++ b/src/lexer.h
@@ -0,0 +1,74 @@
+/*
+ * Copyright (C) 2024 olang maintainers
+ *
+ * This program is free software: you can redistribute it and/or modify
+ * it under the terms of the GNU General Public License as published by
+ * the Free Software Foundation, either version 3 of the License, or
+ * (at your option) any later version.
+ *
+ * This program is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+ * GNU General Public License for more details.
+ *
+ * You should have received a copy of the GNU General Public License
+ * along with this program.  If not, see <https://www.gnu.org/licenses/>.
+ */
+#ifndef LEXER_H
+#define LEXER_H
+
+#include "string_view.h"
+#include <stdint.h>
+
+typedef struct lexer
+{
+    string_view_t source;
+    size_t offset;
+    size_t row;
+    size_t bol;
+} lexer_t;
+
+typedef enum token_kind
+{
+    TOKEN_UNKNOWN,
+    TOKEN_IDENTIFIER,
+    TOKEN_NUMBER,
+
+    // Keywords
+    TOKEN_FN,
+    TOKEN_RETURN,
+
+    // Single char
+    TOKEN_LF,
+    TOKEN_OPAREN,
+    TOKEN_CPAREN,
+    TOKEN_COLON,
+    TOKEN_OCURLY,
+    TOKEN_CCURLY,
+    TOKEN_EOF
+} token_kind_t;
+
+typedef struct token_loc
+{
+    size_t offset;
+    size_t row;
+    size_t bol;
+} token_loc_t;
+
+typedef struct token
+{
+    token_kind_t kind;
+    string_view_t value;
+    token_loc_t location;
+} token_t;
+
+void
+lexer_init(lexer_t *lexer, string_view_t source);
+
+void
+lexer_next_token(lexer_t *lexer, token_t *token);
+
+char *
+token_kind_to_cstr(token_kind_t kind);
+
+#endif /* LEXER_H */
diff --git a/tests/integration/cli_runner.c b/tests/integration/cli_runner.c
index 4e0f7c4..0531bcc 100644
--- a/tests/integration/cli_runner.c
+++ b/tests/integration/cli_runner.c
@@ -62,7 +62,7 @@ create_tmp_file_name(char *file_name)
 }
 
 cli_result_t
-cli_runner_compile_file(char *src)
+cli_runner_compiler_dump_tokens(char *src)
 {
     assert_compiler_exists();
 
@@ -70,7 +70,7 @@ cli_runner_compile_file(char *src)
     create_tmp_file_name(result.program_path);
 
     char command[1024];
-    sprintf(command, "%s -o %s %s", OLANG_COMPILER_PATH, result.program_path, src);
+    sprintf(command, "%s %s --dump-tokens", OLANG_COMPILER_PATH, src);
 
     result.exit_code = system(command);
     return result;
diff --git a/tests/integration/cli_runner.h b/tests/integration/cli_runner.h
index 5caa319..8f4d69a 100644
--- a/tests/integration/cli_runner.h
+++ b/tests/integration/cli_runner.h
@@ -23,5 +23,5 @@ typedef struct cli_result_t
 } cli_result_t;
 
 cli_result_t
-cli_runner_compile_file(char *src);
+cli_runner_compiler_dump_tokens(char *src);
 #endif
diff --git a/tests/integration/cli_test.c b/tests/integration/cli_test.c
index c7a9557..ce2ed91 100644
--- a/tests/integration/cli_test.c
+++ b/tests/integration/cli_test.c
@@ -21,7 +21,7 @@
 static MunitResult
 test_cli_hello_file(const MunitParameter params[], void *user_data_or_fixture)
 {
-    cli_result_t compilation_result = cli_runner_compile_file("../../examples/hello.olang");
+    cli_result_t compilation_result = cli_runner_compiler_dump_tokens("../../examples/main_exit.0");
     munit_assert_int(compilation_result.exit_code, ==, 0);
     return MUNIT_OK;
 }
-- 
2.43.2


^ permalink raw reply	[flat|nested] 6+ messages in thread

* [PATCH olang v4 3/4] docs: create man page for 0c compiler
  2024-02-19 21:04 [PATCH olang v4 0/4] Create --dump-tokens on compiler cli Johnny Richard
  2024-02-19 21:04 ` [PATCH olang v4 1/4] utils: create string_view data structure Johnny Richard
  2024-02-19 21:04 ` [PATCH olang v4 2/4] lexer: create --dump-tokens cli command Johnny Richard
@ 2024-02-19 21:04 ` Johnny Richard
  2024-02-19 21:04 ` [PATCH olang v4 4/4] lexer: test: add integration tests for --dump-tokens Johnny Richard
  3 siblings, 0 replies; 6+ messages in thread
From: Johnny Richard @ 2024-02-19 21:04 UTC (permalink / raw)
  To: ~johnnyrichard/olang-devel; +Cc: Johnny Richard

Since the 0c compiler now provides a --dump-tokens option, the man page
documents it as well.

The site build has been adapted to accommodate the manpages build.
Everything should work as before for site generation.
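
With this in place, the man page can be rendered locally (assuming
pandoc is available) with `make -C docs manpages` and previewed with
`man -l docs/build/man/0c.1`.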

Signed-off-by: Johnny Richard <johnny@johnnyrichard.com>
---
 docs/Makefile       | 21 +++++++++++++++------
 docs/manpages/0c.md | 21 +++++++++++++++++++++
 src/0c.c            |  1 -
 3 files changed, 36 insertions(+), 7 deletions(-)
 create mode 100644 docs/manpages/0c.md

diff --git a/docs/Makefile b/docs/Makefile
index 731da5d..54561a1 100644
--- a/docs/Makefile
+++ b/docs/Makefile
@@ -1,14 +1,16 @@
 PANDOC     := pandoc
 INDEX      := index.md
 BUILD_DIR  := build
-TARGET     := $(BUILD_DIR)/index.html
+SITE_DIR   := $(BUILD_DIR)/site
+TARGET     := $(SITE_DIR)/index.html
 DIST_FILE  := site.tar.gz
 PAGES_DIR  := pages
+MANPAGES   := $(BUILD_DIR)/man
 PAGES      := $(wildcard $(PAGES_DIR)/*.md)
-HTML_PAGES := $(patsubst $(PAGES_DIR)/%.md, $(BUILD_DIR)/$(PAGES_DIR)/%.html, $(PAGES))
+HTML_PAGES := $(patsubst $(PAGES_DIR)/%.md, $(SITE_DIR)/$(PAGES_DIR)/%.html, $(PAGES))
 
 .PHONY: all
-all:  $(BUILD_DIR) $(TARGET) $(PAGES)
+all:  $(BUILD_DIR) $(TARGET) $(PAGES) manpages
 
 .PHONY: clean
 clean:
@@ -18,15 +20,22 @@ clean:
 .PHONY: dist
 dist: $(DIST_FILE)
 
+.PHONY: manpages
+manpages: $(BUILD_DIR) $(MANPAGES)/0c.1
+
+$(MANPAGES)/%.1: manpages/%.md
+	$(PANDOC) -s -t man $< > $@
+
 $(DIST_FILE): all
-	tar -czf $(DIST_FILE) -C $(BUILD_DIR) .
+	tar -czf $(DIST_FILE) -C $(SITE_DIR) .
 
 $(TARGET): $(HTML_PAGES)
 	$(PANDOC) -s --template template.html -f markdown -t html $(INDEX) > $(TARGET)
 
 $(BUILD_DIR):
 	@mkdir -p $@
-	@mkdir -p $@/$(PAGES_DIR)
+	@mkdir -p $(SITE_DIR)/$(PAGES_DIR)
+	@mkdir -p $(MANPAGES)
 
-$(BUILD_DIR)/$(PAGES_DIR)/%.html: $(PAGES_DIR)/%.md
+$(SITE_DIR)/$(PAGES_DIR)/%.html: $(PAGES_DIR)/%.md
 	$(PANDOC) -s --template template.html -f markdown -t html --toc $< > $@
diff --git a/docs/manpages/0c.md b/docs/manpages/0c.md
new file mode 100644
index 0000000..87a56df
--- /dev/null
+++ b/docs/manpages/0c.md
@@ -0,0 +1,21 @@
+% 0C(1)
+% olang maintainers
+% Feb 2024
+
+# NAME
+
+0c - zero language compiler
+
+# SYNOPSIS
+
+**0c** **----dump-tokens** source.0
+
+# DESCRIPTION
+
+**0c** is the official compiler for the zero language; it also provides
+utilities to help with language development.
+
+# GENERAL OPTIONS
+
+**----dump-tokens**
+:   Display the lexical tokens of the given source.0 file.
diff --git a/src/0c.c b/src/0c.c
index 0af9caa..e84559d 100644
--- a/src/0c.c
+++ b/src/0c.c
@@ -34,7 +34,6 @@ cli_args_shift(cli_args_t *args);
 
 typedef struct cli_opts
 {
-    // TODO: create man page instruction for --dump-tokens option
     bool dump_tokens;
     char *file_path;
 } cli_opts_t;
-- 
2.43.2


^ permalink raw reply	[flat|nested] 6+ messages in thread

* [PATCH olang v4 4/4] lexer: test: add integration tests for --dump-tokens
  2024-02-19 21:04 [PATCH olang v4 0/4] Create --dump-tokens on compiler cli Johnny Richard
                   ` (2 preceding siblings ...)
  2024-02-19 21:04 ` [PATCH olang v4 3/4] docs: create man page for 0c compiler Johnny Richard
@ 2024-02-19 21:04 ` Johnny Richard
  2024-02-19 20:07   ` [olang/patches/.build.yml] build failed builds.sr.ht
  3 siblings, 1 reply; 6+ messages in thread
From: Johnny Richard @ 2024-02-19 21:04 UTC (permalink / raw)
  To: ~johnnyrichard/olang-devel; +Cc: Carlos Maniero, Johnny Richard

From: Carlos Maniero <carlos@maniero.me>

We want to test the 0c compiler in a black-box way.  This test uses
`pipe` to capture the compiler's standard output, and it checks
--dump-tokens against examples/main_exit.0.
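
For reference, each expected line follows print_token()'s
"<file>:<row>:<column>: <kind>" format, where row is location.row + 1
and column is (offset - bol) + 1.  For example, the `return` keyword in
main_exit.0 starts at byte offset 19 on the second line (row = 1,
bol = 17), so it is reported as "2:3: <return>", matching the assertion
below.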

Signed-off-by: Johnny Richard <johnny@johnnyrichard.com>
---
 src/0c.c                       |  1 -
 tests/integration/cli_runner.c | 47 ++++++++++++++++++++++++++++++----
 tests/integration/cli_runner.h |  1 +
 tests/integration/cli_test.c   | 14 ++++++++++
 4 files changed, 57 insertions(+), 6 deletions(-)

diff --git a/src/0c.c b/src/0c.c
index e84559d..978b770 100644
--- a/src/0c.c
+++ b/src/0c.c
@@ -75,7 +75,6 @@ main(int argc, char **argv)
 
     string_view_t file_content = read_entire_file(opts.file_path);
 
-    // TODO: missing integration test for lexer tokenizing
     lexer_t lexer = { 0 };
     lexer_init(&lexer, file_content);
 
diff --git a/tests/integration/cli_runner.c b/tests/integration/cli_runner.c
index 0531bcc..7e4fe9a 100644
--- a/tests/integration/cli_runner.c
+++ b/tests/integration/cli_runner.c
@@ -20,6 +20,7 @@
 #include <stdio.h>
 #include <stdlib.h>
 #include <string.h>
+#include <sys/wait.h>
 #include <unistd.h>
 
 #define OLANG_COMPILER_PATH "../../0c"
@@ -62,16 +63,52 @@ create_tmp_file_name(char *file_name)
 }
 
 cli_result_t
-cli_runner_compiler_dump_tokens(char *src)
+cli_runner_compiler(char *src, char *args[])
 {
     assert_compiler_exists();
 
-    cli_result_t result;
+    cli_result_t result = { 0 };
     create_tmp_file_name(result.program_path);
 
-    char command[1024];
-    sprintf(command, "%s %s --dump-tokens", OLANG_COMPILER_PATH, src);
+    int fd_link[2];
+
+    if (pipe(fd_link) == -1) {
+        perror("pipe error.");
+        exit(1);
+    }
+
+    pid_t pid = fork();
+
+    if (pid == -1) {
+        perror("fork error.");
+        exit(1);
+    }
+
+    if (pid == 0) {
+        dup2(fd_link[1], STDOUT_FILENO);
+        close(fd_link[0]);
+        close(fd_link[1]);
+
+        execv(OLANG_COMPILER_PATH, args);
+        perror("execl error.");
+        exit(127);
+    } else {
+        close(fd_link[1]);
+        if (read(fd_link[0], result.compiler_output, sizeof(result.compiler_output)) == -1) {
+            perror("read error.");
+            exit(1);
+        }
+        int status;
+        waitpid(pid, &status, 0);
+        result.exit_code = WEXITSTATUS(status);
+    }
 
-    result.exit_code = system(command);
     return result;
 }
+
+cli_result_t
+cli_runner_compiler_dump_tokens(char *src)
+{
+    char *program_args[] = { "0c", "--dump-tokens", src, NULL };
+    return cli_runner_compiler(src, program_args);
+}
diff --git a/tests/integration/cli_runner.h b/tests/integration/cli_runner.h
index 8f4d69a..7ce4e7b 100644
--- a/tests/integration/cli_runner.h
+++ b/tests/integration/cli_runner.h
@@ -20,6 +20,7 @@ typedef struct cli_result_t
 {
     int exit_code;
     char program_path[255];
+    char compiler_output[1024];
 } cli_result_t;
 
 cli_result_t
diff --git a/tests/integration/cli_test.c b/tests/integration/cli_test.c
index ce2ed91..1fd70c7 100644
--- a/tests/integration/cli_test.c
+++ b/tests/integration/cli_test.c
@@ -23,6 +23,20 @@ test_cli_hello_file(const MunitParameter params[], void *user_data_or_fixture)
 {
     cli_result_t compilation_result = cli_runner_compiler_dump_tokens("../../examples/main_exit.0");
     munit_assert_int(compilation_result.exit_code, ==, 0);
+    munit_assert_string_equal(compilation_result.compiler_output,
+                              "../../examples/main_exit.0:1:1: <fn>\n"
+                              "../../examples/main_exit.0:1:4: <identifier>\n"
+                              "../../examples/main_exit.0:1:8: <(>\n"
+                              "../../examples/main_exit.0:1:9: <)>\n"
+                              "../../examples/main_exit.0:1:10: <:>\n"
+                              "../../examples/main_exit.0:1:12: <identifier>\n"
+                              "../../examples/main_exit.0:1:16: <{>\n"
+                              "../../examples/main_exit.0:1:17: <line_feed>\n"
+                              "../../examples/main_exit.0:2:3: <return>\n"
+                              "../../examples/main_exit.0:2:10: <number>\n"
+                              "../../examples/main_exit.0:2:11: <line_feed>\n"
+                              "../../examples/main_exit.0:3:1: <}>\n"
+                              "../../examples/main_exit.0:3:2: <line_feed>\n");
     return MUNIT_OK;
 }
 
-- 
2.43.2


^ permalink raw reply	[flat|nested] 6+ messages in thread

end of thread, other threads:[~2024-02-19 20:07 UTC | newest]

Thread overview: 6+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2024-02-19 21:04 [PATCH olang v4 0/4] Create --dump-tokens on compiler cli Johnny Richard
2024-02-19 21:04 ` [PATCH olang v4 1/4] utils: create string_view data structure Johnny Richard
2024-02-19 21:04 ` [PATCH olang v4 2/4] lexer: create --dump-tokens cli command Johnny Richard
2024-02-19 21:04 ` [PATCH olang v4 3/4] docs: create man page for 0c compiler Johnny Richard
2024-02-19 21:04 ` [PATCH olang v4 4/4] lexer: test: add integration tests for --dump-tokens Johnny Richard
2024-02-19 20:07   ` [olang/patches/.build.yml] build failed builds.sr.ht

Code repositories for project(s) associated with this public inbox

	https://git.johnnyrichard.com/olang.git

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox