My understanding is that basically only `git`, `docker.io`, and `python` are
needed for OSS-Fuzz's CI, so roughly 5 GB of the largest preinstalled packages and
20+ GB of unused folders can be dropped from the CI image to make room for other things.
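As a rough illustration, here is a minimal Python sketch of how a CI step might reclaim that space. The specific package and directory names below are assumptions for illustration, not a list taken from this proposal.

```python
#!/usr/bin/env python3
"""Sketch: reclaim disk space on a CI runner before building OSS-Fuzz images."""
import shutil
import subprocess

# Hypothetical preinstalled directories that OSS-Fuzz CI does not need.
UNUSED_DIRS = ['/usr/local/lib/android', '/usr/share/dotnet']
# Hypothetical large packages to purge (names are illustrative).
UNUSED_PACKAGES = ['ghc*', 'dotnet*']


def free_disk_space():
  """Removes unused directories and packages, then reports free space."""
  for path in UNUSED_DIRS:
    shutil.rmtree(path, ignore_errors=True)
  subprocess.run(['sudo', 'apt-get', 'remove', '-y'] + UNUSED_PACKAGES,
                 check=False)
  free_gib = shutil.disk_usage('/').free // 2**30
  print(f'Free space after cleanup: {free_gib} GiB')


if __name__ == '__main__':
  free_disk_space()
```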
This PR enables using Jazzer.js for fuzzing Node.js projects in
OSS-Fuzz.
Part of #8324
---------
Co-authored-by: jonathanmetzman <31354670+jonathanmetzman@users.noreply.github.com>
* Add Centipede as a fuzzer
* Specify dictionary param of Centipede
* Update docs
* Mark Centipede as experimental
* More accurate description
* Remove garbage
* Simplify code
* Move mkdir to dockerfile
* Add the weak.c trick
* Install deps with Centipede's script & uninstall new deps
* Fix doc
* Reuse libweak_sancov_stubs.so
* Reorganise flags
* format
* Consistent file type
* Reuse the weak references defined in Centipede
* Replace the shared library of weak symbols with a static one
* Correct the place to call mkdir
* Allow 2G of SHM for Centipede
* Create dirs in run_fuzzer
* Keep Centipede up-to-date
* Avoid duplicating Centipede's binary
* The params of Centipede and their explanations
* The engine info of Centipede
* Save the target binary (with san) in a subdir of the project
* Set the target (with san) dir in check_build
* Create the target (with san) first to avoid side-effects
* Fix clone
* Fix format
* Add periods
* Fix comments
* Fix dirs
* Fix parameters
* Adding Centipede as a fuzzing engine for Scarecrow
* Add CI support
* Represent sanitizer with a variable
* Remove the unnecessary definition of FUZZER_OUT
* Reorganise binary directories
* format
* A minor note
* Prevent issues with dirs that already exist
* Use os.path.join to join paths
* Make a function to get the out/ dir in check_build
* Reusing existing flags in .bazel
* Avoid hardcoding the sanitizer; set rss_limit_mb=4096 and leave address_space_limit_mb disabled (see the sketch after this list)
* Better ways to add bazel build options
* A better way to add bazel flags
* Remove redundant --bazelrc
* Better Cohesion
* Avoid code duplication
* Simplify code
* Exit on crash
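A minimal sketch of how the parameters listed above might be assembled into a Centipede invocation. This is not OSS-Fuzz's actual run_fuzzer logic: the engine path, workdir layout, and the shared-memory flag name are assumptions, while the rss_limit_mb, address_space_limit_mb, dictionary, and exit-on-crash settings come from the commits above.

```python
"""Sketch: assembling a Centipede command line (illustrative only)."""
import subprocess


def run_centipede(target_path, workdir, dictionary=None):
  """Runs one Centipede session against a sanitized target binary."""
  command = [
      '/out/centipede',                # engine binary (path is an assumption)
      f'--binary={target_path}',       # sanitized target kept in its own subdir
      f'--workdir={workdir}',
      '--rss_limit_mb=4096',           # value chosen in the commits above
      '--address_space_limit_mb=0',    # left disabled, per the commits above
      '--shmem_size_mb=2048',          # assumed flag name for the 2G of SHM
      '--exit_on_crash=1',             # stop the run as soon as a crash is found
  ]
  if dictionary:
    command.append(f'--dictionary={dictionary}')
  return subprocess.run(command, check=False).returncode
```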
This test doesn't really do diffing or determine affected fuzzers properly.
Nor does it check the workspace for the existence of certain things, nor
does it check the filesystem for proof that things happened. It is still
a WIP.
👋 hello there! I'm a fellow Googler who works on projects that leverage GitHub Actions for CI/CD. Recently I noticed a large increase in our queue time, and I've tracked it down to the [limit of 180 concurrent jobs](https://docs.github.com/en/actions/reference/usage-limits-billing-and-administration) for an organization. To help us be better citizens, I'm proposing changes across a few repositories that will reduce GitHub Actions hours and consumption. I hope these changes are reasonable and I'm happy to talk through them in more detail.
- Only run GitHub Actions for pushes and PRs against the main branch of the repository. If your team uses a forking model, this change will not affect you. If your team pushes branches to the repository directly, this change means actions only run on pushes to the primary branches or when you open a Pull Request against a primary branch.
- For long-running jobs (especially tests), I added the "Cancel previous" workflow. This is very helpful to prevent a large queue backlog when you are doing rapid development and pushing multiple commits. Without this, GitHub Actions' default behavior is to run all actions on all commits.
There are other changes you could make, depending on your project (but I'm not an expert):
- If you have tests that should only run when a subset of code changes, consider gating your workflow to particular file paths. For example, we have some jobs that do Terraform linting, but [they only run when Terraform files are changed](c4f59fee71/.github/workflows/terraform.yml (L3-L11)).
Hopefully these changes are not too controversial, and hopefully you can also see how they would reduce Actions consumption and make us better citizens to fellow Googlers. If you have any questions, feel free to respond here or ping me on chat. Thank you!
Make unittests take 20 seconds to run instead of 35.
Make integration tests take 50 seconds to run instead of 6 minutes.
Make CI take 6 minutes instead of 12 minutes.
1. Allow running tests in parallel. Locally this takes the time for running all tests (including integration tests) from 6 minutes to ~50 seconds. We don't run in parallel by default since, on my machine, it doesn't really save time unless integration tests are included (probably due to the overhead of starting ~70 processes). This also speeds up CI from about 12 minutes to 6 minutes (since GitHub Actions machines have 2 cores each).
2. Fix how we run tests. I'm not exactly sure why, but the method we used for discovering tests (recursing through every directory and passing each one to unittest) caused the build/infra tests to execute twice. Fixing this makes running the unittests take ~20 seconds instead of ~35.
This change also uses pytest to run the tests, since it makes running them in parallel easy (a minimal sketch follows below).
This change was made possible by #5113
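A minimal sketch of the parallel test run, assuming the pytest-xdist plugin provides the `-n` option; the PARALLEL environment variable and the test directory are illustrative, not the actual invocation used in CI.

```python
#!/usr/bin/env python3
"""Sketch: run the test suite, optionally in parallel via pytest-xdist."""
import os
import sys

import pytest


def main():
  args = ['infra']                # directory containing the tests (assumed)
  if os.getenv('PARALLEL'):
    args.extend(['-n', 'auto'])   # pytest-xdist: one worker per CPU core
  return pytest.main(args)


if __name__ == '__main__':
  sys.exit(main())
```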
1. Fix a problem where permissions were being changed to root by a non-root test (the test was doing this by invoking test_all.py within Docker).
2. Mark tests as integration tests so that cifuzz_test.py can be run in a reasonable amount of time (see the sketch after this list).
3. Prevent some unittests from polluting the source repo.
4. Add .venv to .gitignore
5. Rename test_test_all.py to the correctly formatted name "test_all_test.py"
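A minimal sketch of how a test can be marked so it only runs when integration tests are explicitly requested; the INTEGRATION_TESTS variable name and the test class are assumptions for illustration.

```python
"""Sketch: gate slow integration tests behind an environment variable."""
import os
import unittest


@unittest.skipIf(not os.getenv('INTEGRATION_TESTS'),
                 'set INTEGRATION_TESTS=1 to run integration tests')
class BuildFuzzersIntegrationTest(unittest.TestCase):
  """Slow tests that build real fuzzers inside Docker."""

  def test_build_succeeds(self):
    # Placeholder body; a real test would run the build and assert on results.
    self.assertTrue(True)


if __name__ == '__main__':
  unittest.main()
```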
* Update infra/base-images/all.sh
Add build of base-sanitizer-libs-builder and msan-libs-builder to this
shell script.
* msan: Don't warn on un-instrumented standard libs
These libraries do not need to be built with instrumentation, because
MemorySanitizer includes interceptors for them.
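A minimal sketch of the idea, assuming an allowlist-style check in the wrapper; the library names below are assumptions rather than the actual list used by the builder.

```python
"""Sketch: skip the un-instrumented warning for libraries MSan intercepts."""

# Standard libraries whose functions MemorySanitizer intercepts, so linking
# their un-instrumented system copies is expected and not worth a warning.
INTERCEPTED_LIBS = frozenset({'libc.so.6', 'libm.so.6', 'libpthread.so.0'})


def should_warn_about(library_name):
  """Returns True if linking un-instrumented `library_name` deserves a warning."""
  return library_name not in INTERCEPTED_LIBS
```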
* Fix indentation
* Add missing docstrings
* Fix unused variable
* Fix invalid names
* Install python-apt on CI
* Revert "Install python-apt on CI"
This reverts commit d3da49cf90.
* Install and use python-apt in system directory
* Revert "Install and use python-apt in system directory"
This reverts commit e0ede101fb.
* Build python-apt from source
* Check out correct version of python-apt
* Fix octal literals
* More indentation fixes
* Add more missing docstrings
* Change variable names of opened files
* Remove unused import
* Ignore lints about package.Package API
* Fix or ignore remaining invalid names
* Fix apparent typo in compiler_wrapper_test.py
-z should precede a keyword, not a long option
* Fix use of xrange
* Style fixes, compiler_wrapper
* Fix apparent error in compiler_wrapper_test.py
Similar to the previous error, the test case would pass "-z
--no-undefined" to the linker. "-z" only has an effect when it is
followed by a keyword; otherwise ld ignores it and prints a warning
message. In this test case, "-z" and "--no-undefined" were passed in two
separate "-Wl," compiler arguments, but they reflect a common issue.
* Add missing license header
* Rename more functions
* Better name for global variable
* Rename methods of Package
* Rename functions in msan_builder.py
* Fix invalid variable names
* Fix useless-object-inheritance
* pylint: Fixes for Package and its subclasses
* Remove unused imports
* Indentation fixes
* Fix too-many-locals error in msan_build.py
* Add missing docstrings
* [util-linux] cover mnt_table_parse_stream
Waiting for https://github.com/karelzak/util-linux/pull/1068
* temporarily point OSS-Fuzz to evverx/util-linux
* make sure it can be built with sanitizer=coverage
Integrating the first cloud function I implemented. It syncs the project list from GitHub and uploads it to Cloud Datastore; the list will be used by another cloud function to request builds.
Co-authored-by: Kabeer Seth <kabeerseth@google.com>
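A minimal sketch of such a function, assuming a Pub/Sub-style trigger, the GitHub contents API, and a hypothetical `Project` Datastore entity; none of these names are taken from the actual change.

```python
"""Sketch: sync the OSS-Fuzz project list from GitHub into Cloud Datastore."""
import requests
from google.cloud import ndb

_PROJECTS_URL = ('https://api.github.com/repos/google/oss-fuzz/'
                 'contents/projects')


class Project(ndb.Model):
  """Datastore entity holding one OSS-Fuzz project name (assumed schema)."""
  name = ndb.StringProperty()


def sync_projects(event, context):
  """Cloud Function entry point (assumed Pub/Sub trigger signature)."""
  del event, context  # Unused.
  response = requests.get(_PROJECTS_URL)
  response.raise_for_status()
  names = [entry['name'] for entry in response.json()
           if entry['type'] == 'dir']
  client = ndb.Client()
  with client.context():
    ndb.put_multi([Project(id=name, name=name) for name in names])
```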