Mirror of Apache Kafka
License: Apache-2.0
Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications.
You need to have Java installed.

We build and test Apache Kafka with Java 17 and 23. The `release` parameter in javac is set to `11` for the clients and streams modules, and `17` for the rest, ensuring compatibility with their respective minimum Java versions. Similarly, the `release` parameter in scalac is set to `11` for the streams modules and `17` for the rest.

Scala 2.13 is the only supported version in Apache Kafka.
To build a jar:

```shell
./gradlew jar
```

Then follow the instructions at https://kafka.apache.org/quickstart.
To build a source jar:

```shell
./gradlew srcJar
```

To build aggregated javadoc:

```shell
./gradlew aggregatedJavadoc
```

To build javadoc and scaladoc:

```shell
./gradlew javadoc
./gradlew javadocJar  # builds a javadoc jar for each module
./gradlew scaladoc
./gradlew scaladocJar # builds a scaladoc jar for each module
./gradlew docsJar     # builds both (if applicable) javadoc and scaladoc jars for each module
```
To run unit and integration tests:

```shell
./gradlew test  # runs both unit and integration tests
./gradlew unitTest
./gradlew integrationTest
./gradlew test -Pkafka.test.run.flaky=true  # runs tests that are marked as flaky
```
To force re-running tests without a code change:

```shell
./gradlew test --rerun-tasks
./gradlew unitTest --rerun-tasks
./gradlew integrationTest --rerun-tasks
```
To run a particular unit or integration test:

```shell
./gradlew clients:test --tests RequestResponseTest
```
To run a particular test repeatedly until it fails (useful for reproducing flaky failures):

```shell
N=500; I=0; while [ $I -lt $N ] && ./gradlew clients:test --tests RequestResponseTest --rerun --fail-fast; do (( I=$I+1 )); echo "Completed run: $I"; sleep 1; done
```
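The one-liner above can be hard to parse. Here is the same control flow expanded, with `true` standing in for the Gradle invocation so the loop itself is easy to follow (swap the placeholder back for real use):

```shell
# Run-until-failure pattern: the command under test is part of the `while`
# condition, so the first failing run (or reaching N runs) stops the loop.
# `true` is a placeholder for the real ./gradlew invocation.
N=5; I=0
while [ "$I" -lt "$N" ] && true; do
  I=$((I + 1))
  echo "Completed run: $I"
done
echo "Total runs: $I"
```

Because the command runs inside the loop condition, its exit status directly controls whether another iteration starts, which is what makes this shape convenient for flaky-test hunting.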
To run a particular test method within a unit or integration test:

```shell
./gradlew core:test --tests kafka.api.ProducerFailureHandlingTest.testCannotSendToInternalTopic
./gradlew clients:test --tests org.apache.kafka.clients.MetadataTest.testTimeToNextUpdate
```
By default, only a small amount of log output is produced while testing. You can adjust this by changing the `log4j2.yaml` file in the module's `src/test/resources` directory.

For example, if you want to see more logs for the clients project tests, you can modify the line in `clients/src/test/resources/log4j2.yaml` to `level: INFO` and then run:

```shell
./gradlew cleanTest clients:test --tests NetworkClientTest
```

You should then see `INFO`-level logs in the file under the `clients/build/test-results/test` directory.
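The layout of `log4j2.yaml` varies between modules, but as a rough sketch (the appender name and pattern below are illustrative, not copied from the repository), a Log4j 2 YAML configuration with the root level raised to `INFO` looks like:

```yaml
# Illustrative Log4j 2 YAML sketch; only the `level` key mirrors the
# change described above, the appender details are hypothetical.
Configuration:
  Appenders:
    Console:
      name: STDOUT
      PatternLayout:
        pattern: "[%d] %p %m (%c)%n"
  Loggers:
    Root:
      level: INFO
      AppenderRef:
        ref: STDOUT
```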
Retries are disabled by default, but you can set `maxTestRetryFailures` and `maxTestRetries` to enable them. The following example declares `-PmaxTestRetries=1` and `-PmaxTestRetryFailures=3` to enable a failed test to be retried once, with a total retry limit of 3:

```shell
./gradlew test -PmaxTestRetries=1 -PmaxTestRetryFailures=3
```

See the Test Retry Gradle Plugin and build.yml for more details.
Generate coverage reports for the whole project:

```shell
./gradlew reportCoverage -PenableTestCoverage=true -Dorg.gradle.parallel=false
```

Generate coverage for a single module, for example:

```shell
./gradlew clients:reportCoverage -PenableTestCoverage=true -Dorg.gradle.parallel=false
```
To build a binary release gzipped tar ball:

```shell
./gradlew clean releaseTarGz
```

The release file can be found inside `./core/build/distributions/`.
Sometimes it is only necessary to rebuild the RPC auto-generated message data when switching between branches, as it could otherwise fail to compile due to code changes. To do so, run:

```shell
./gradlew processMessages processTestMessages
```
To run a Kafka broker using compiled files:

```shell
KAFKA_CLUSTER_ID="$(./bin/kafka-storage.sh random-uuid)"
./bin/kafka-storage.sh format --standalone -t $KAFKA_CLUSTER_ID -c config/server.properties
./bin/kafka-server-start.sh config/server.properties
```

Using the Docker image:

```shell
docker run -p 9092:9092 apache/kafka:3.7.0
```
To clean the build:

```shell
./gradlew clean
```
To build a particular project, the following works for the `core`, `examples` and `clients` modules:

```shell
./gradlew core:jar
./gradlew core:test
```
Streams has multiple sub-projects, but you can run all the tests:

```shell
./gradlew :streams:testAll
```
To list all Gradle tasks:

```shell
./gradlew tasks
```
Note: please ensure that JDK 17 is used when developing Kafka.

IntelliJ supports Gradle natively and will automatically check Java syntax and compatibility for each module, even if the Java version shown in `Structure > Project Settings > Modules` may not be the correct one.
For Eclipse, run:

```shell
./gradlew eclipse
```

The `eclipse` task has been configured to use `${project_dir}/build_eclipse` as Eclipse's build directory. Eclipse's default build directory (`${project_dir}/bin`) clashes with Kafka's scripts directory, and we don't use Gradle's build directory to avoid known issues with this configuration.
For the Streams archetype project, one cannot use Gradle to upload to Maven; instead the `mvn deploy` command needs to be called at the quickstart folder:

```shell
cd streams/quickstart
mvn deploy
```

Please note that for this to work you should create/update user Maven settings (typically `${USER_HOME}/.m2/settings.xml`) to assign the following variables:

```xml
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
                              https://maven.apache.org/xsd/settings-1.0.0.xsd">
   ...
   <servers>
      ...
      <server>
         <id>apache.snapshots.https</id>
         <username>${maven_username}</username>
         <password>${maven_password}</password>
      </server>
      <server>
         <id>apache.releases.https</id>
         <username>${maven_username}</username>
         <password>${maven_password}</password>
      </server>
      ...
   </servers>
   ...
```
To install all the jars to the local Maven repository (or only the streams project), skipping signing:

```shell
./gradlew -PskipSigning=true publishToMavenLocal
./gradlew -PskipSigning=true :streams:publishToMavenLocal
```
To build the test jars:

```shell
./gradlew testJar
```
There are two code quality analysis tools that we regularly run: spotbugs and checkstyle.
Checkstyle enforces a consistent coding style in Kafka. You can run checkstyle using:

```shell
./gradlew checkstyleMain checkstyleTest spotlessCheck
```

The checkstyle warnings will be found in `reports/checkstyle/reports/main.html` and `reports/checkstyle/reports/test.html` files in the subproject build directories. They are also printed to the console. The build will fail if Checkstyle fails.

For experiments (or regression testing purposes) add the `-PcheckstyleVersion=X.y.z` switch to override the project-defined checkstyle version.
The import order is a part of the static checks. Please run `spotlessApply` to optimize the imports of Java code before filing a pull request:

```shell
./gradlew spotlessApply
```
Spotbugs uses static analysis to look for bugs in the code. You can run spotbugs using:

```shell
./gradlew spotbugsMain spotbugsTest -x test
```

The spotbugs warnings will be found in `reports/spotbugs/main.html` and `reports/spotbugs/test.html` files in the subproject build directories. Use `-PxmlSpotBugsReport=true` to generate an XML report instead of an HTML one.
We use JMH to write microbenchmarks that produce reliable results in the JVM. See jmh-benchmarks/README.md for details on how to run the microbenchmarks.
The Gradle dependency debugging documentation mentions using the `dependencies` or `dependencyInsight` tasks to debug dependencies for the root project or individual subprojects.

Alternatively, use the `allDeps` or `allDepInsight` tasks to recursively iterate through all subprojects:

```shell
./gradlew allDeps
./gradlew allDepInsight --configuration runtimeClasspath --dependency com.fasterxml.jackson.core:jackson-databind
```

These take the same arguments as the built-in variants.
To check for dependency updates:

```shell
./gradlew dependencyUpdates
```
The following options should be set with a `-P` switch, for example `./gradlew -PmaxParallelForks=1 test`.

- `commitId`: sets the build commit ID as `.git/HEAD` might not be correct if there are local commits added for build purposes.
- `mavenUrl`: sets the URL of the maven deployment repository (`file://path/to/repo` can be used to point to a local repository).
- `maxParallelForks`: maximum number of test processes to start in parallel. Defaults to the number of processors available to the JVM.
- `maxScalacThreads`: maximum number of worker threads for the scalac backend. Defaults to the lowest of `8` and the number of processors available to the JVM. The value must be between 1 and 16 (inclusive).
- `ignoreFailures`: ignore test failures from junit.
- `showStandardStreams`: shows standard out and standard error of the test JVM(s) on the console.
- `skipSigning`: skips signing of artifacts.
- `testLoggingEvents`: unit test events to be logged, separated by comma. For example `./gradlew -PtestLoggingEvents=started,passed,skipped,failed test`.
- `xmlSpotBugsReport`: enable XML reports for spotBugs. This also disables HTML reports as only one can be enabled at a time.
- `maxTestRetries`: maximum number of retries for a failing test case.
- `maxTestRetryFailures`: maximum number of test failures before retrying is disabled for subsequent tests.
- `enableTestCoverage`: enables test coverage plugins and tasks, including bytecode enhancement of classes required to track said coverage. Note that this introduces some overhead when running tests and hence why it's disabled by default (the overhead varies, but 15-20% is a reasonable estimate).
- `keepAliveMode`: configures the keep alive mode for the Gradle compilation daemon; reuse improves start-up time. The value should be one of `daemon` or `session` (the default is `daemon`). `daemon` keeps the daemon alive until it's explicitly stopped, while `session` keeps it alive until the end of the build session. This currently only affects the Scala compiler, see gradle/gradle#21034 for a PR that attempts to do the same for the Java compiler.
- `scalaOptimizerMode`: configures the optimizing behavior of the scala compiler; the value should be one of `none`, `method`, `inline-kafka` or `inline-scala` (the default is `inline-kafka`). `none` is the scala compiler default, which only eliminates unreachable code. `method` also includes method-local optimizations. `inline-kafka` adds inlining of methods within the kafka packages. Finally, `inline-scala` also includes inlining of methods within the scala library (which avoids lambda allocations for methods like `Option.exists`). `inline-scala` is only safe if the Scala library version is the same at compile time and runtime. Since we cannot guarantee this for all cases (for example, users may depend on the kafka jar for integration tests where they may include a scala library with a different version), we don't enable it by default. See https://www.lightbend.com/blog/scala-inliner-optimizer for more details.
For system tests, see tests/README.md.
Apache Kafka is interested in building the community; we would welcome any thoughts or patches. You can reach us on the Apache mailing lists.

To contribute, follow the instructions here: