JMeter JTL parsing with Logstash for Elasticsearch and InfluxDB ...

anasoid/jmeter-logstash


JMeter JTL and statistics file parsing with Logstash and Elasticsearch. You can find the image on Docker Hub; the statistics.json file is generated by the JMeter HTML report.

2. Quick reference

2.1. Image version

2.2. Features

  1. Parse standard JTL (CSV format).
  2. Possibility to filter requests based on regex filters (include and exclude filters).
  3. Flag samplers generated by TransactionController based on a regex (by default '.+').
  4. For TransactionController, calculate the number of failing samplers and the total.
  5. Add relative time to compare different executions; the variable TESTSTART.MS should be logged with the property sample_variables.
  6. Add project name, test name, environment and execution id to organize results and compare different executions.
  7. Split the label name to get multiple tags per request (by default split by '/').
  8. Flag subresults: when there is a redirection (302, ...), the subrequest has a suffix like "-xx" where xx is the order number.
  9. Support ElasticSearch; can be adapted for other tools.
  10. Can also index custom fields logged in the file with the property sample_variables.

2.3. Content

2.4. Image Variants

The jmeter-logstash images come in many flavors, each designed for a specific use case. The image versions are based on the components used to build the image; by default the Elasticsearch output is used:

  1. Logstash Version: 7.17.9 -> default for 7.17.

3. Getting Started

3.1. Create ElasticSearch stack (Only if using ElasticSearch & Kibana)

  1. Optionally create a docker network (called jmeter). If you don't use it, remove "--net jmeter" from all the following docker commands and adapt the Elasticsearch URL.

```shell
docker network create jmeter
```

  2. Start the Elasticsearch container, or use any Elasticsearch instance you have already installed.

```shell
docker run --name jmeter-elastic --net jmeter \
  -p 9200:9200 -p 9300:9300 \
  -e "ES_JAVA_OPTS=-Xms1024m -Xmx1024m" \
  -e "xpack.security.enabled=false" \
  -e "discovery.type=single-node" \
  docker.elastic.co/elasticsearch/elasticsearch:8.4.1
```

  3. Start Kibana and connect it to Elasticsearch using the environment variable ELASTICSEARCH_HOSTS.

```shell
docker run --name jmeter-kibana --net jmeter -p 5601:5601 \
  -e "ELASTICSEARCH_HOSTS=http://jmeter-elastic:9200" \
  docker.elastic.co/kibana/kibana:8.4.1
```

3.2. Run Logstash

  1. In the project folder, create a folder named 'input', or use any input folder on your machine.
  2. If you choose a different input folder, replace "${PWD}/input" in the following commands with your input folder.
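The two steps above amount to a couple of shell commands (a minimal sketch; `input` matches the default folder name used in the commands below, and the copy path is a placeholder):

```shell
# Create the input folder that will be mounted into the container.
mkdir -p input
# Copy your JTL files there so Logstash can pick them up
# (results/*.jtl is a placeholder path):
#   cp results/*.jtl input/
```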

3.2.1. Run With image from docker hub for Elasticsearch (way 1, preferred)

```shell
# Run image
docker run --rm -it --net jmeter \
  -e "ELASTICSEARCH_HOSTS=http://jmeter-elastic:9200" \
  -v ${PWD}/input:/input/ \
  anasoid/jmeter-logstash
```

3.3. Dashboards

3.3.1. Kibana

Download dashboards from Kibana Dashboards, then go to Stack Management -> Saved Objects in Kibana to import them.

(Screenshots: Main Dashboard, Compare Dashboard)

3.4. HOW-TO

  1. To exit after all files are parsed, use (-e "FILE_EXIT_AFTER_READ=true") together with (-e "FILE_READ_MODE=read").
  2. To keep the Logstash container after execution, do not pass --rm in the arguments.
  3. Logstash keeps the position of the last line parsed in each file in a sincedb file, by default under the path /usr/share/logstash/data/plugins/inputs/file. If you reuse the same container, this file is persisted even if you restart Logstash; if you need to keep it even after removing the container, mount a volume folder at that path (/usr/share/logstash/data/plugins/inputs/file).
  4. To get relative times for comparison, the test start time should be logged: in the user.properties file (sample_variables=TESTSTART.MS,...), via a properties file passed with the -q argument, or directly on the command line (-Jsample_variables=TESTSTART.MS,...). See Full list of command-line options.
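Item 4 above can be sketched as follows (user.properties here is a local file written by the snippet; the commented jmeter invocation uses placeholder .jmx/.jtl names):

```shell
# Option 1: set the property once in user.properties.
echo 'sample_variables=TESTSTART.MS' >> user.properties
# Option 2: pass it directly on the command line instead
# (test.jmx and results.jtl are placeholder names):
#   jmeter -n -t test.jmx -l results.jtl -Jsample_variables=TESTSTART.MS
```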

3.4.1. Example

Run Logstash without removing the container after it stops:

```shell
docker run -it --net jmeter \
  -e "ELASTICSEARCH_HOSTS=http://jmeter-elastic:9200" \
  -v ${PWD}/input:/input/ \
  anasoid/jmeter-logstash
```

Run Logstash and delete each file after it has been fully read:

```shell
docker run --rm -it --net jmeter \
  -e "ELASTICSEARCH_HOSTS=http://jmeter-elastic:9200" \
  -v ${PWD}/input:/input/ \
  -e "FILE_READ_MODE=read" \
  -e "FILE_COMPLETED_ACTION=delete" \
  anasoid/jmeter-logstash
```

Run Logstash with an external sincedb folder:

```shell
docker run --rm -it --net jmeter \
  -e "ELASTICSEARCH_HOSTS=http://jmeter-elastic:9200" \
  -v ${PWD}/input:/input/ \
  -v ${PWD}/.sincedb:/usr/share/logstash/data/plugins/inputs/file \
  anasoid/jmeter-logstash
```

3.5. Parameters

3.5.1. ElasticSearch configuration

| Environment variable | Description | Default |
|---|---|---|
| ELASTICSEARCH_HOSTS | Elasticsearch output configuration hosts (ex: http://elasticsearch:9200) | |
| ELASTICSEARCH_INDEX | Elasticsearch output configuration index | jmeter-jtl-%{+YYYY.MM.dd} |
| ELASTICSEARCH_USER | Elasticsearch output configuration user | |
| ELASTICSEARCH_PASSWORD | Elasticsearch output configuration password | |
| ELASTICSEARCH_SSL_VERIFICATION | Elasticsearch output configuration ssl_certificate_verification | true |
| ELASTICSEARCH_HTTP_COMPRESSION | Elasticsearch output configuration http_compression | false |
| ELASTICSEARCH_VERSION | Elasticsearch template version; should match the Elasticsearch version (not the Logstash version). Valid values are 7 and 8. Only Logstash 7 can work with both Elasticsearch 7 and 8; Logstash 8 works only with Elasticsearch 8. | 7 for Logstash 7.x and 8 for Logstash 8.x |
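As an illustration of these variables, a run against a secured Elasticsearch might look like this (a sketch; the host, credentials and SSL setting are assumptions for the example, not defaults of the image):

```shell
# Hypothetical secured Elasticsearch target: host, user and password
# are placeholders to adapt to your environment.
docker run --rm -it --net jmeter \
  -e "ELASTICSEARCH_HOSTS=https://my-elastic:9200" \
  -e "ELASTICSEARCH_USER=elastic" \
  -e "ELASTICSEARCH_PASSWORD=changeme" \
  -e "ELASTICSEARCH_SSL_VERIFICATION=false" \
  -v ${PWD}/input:/input/ \
  anasoid/jmeter-logstash
```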

3.5.2. Logstash

| Environment variable | Description | Default |
|---|---|---|
| INPUT_PATH | Default input folder used for JTL and statistics files. | /input |
| INPUT_PATH_JTL | Default input folder used for JTL, pattern: (["${INPUT_PATH:/input}/**.jtl","${INPUT_PATH_JTL:/input}/**.jtl"]) | /input |
| INPUT_PATH_STAT | Default input folder used for statistics, pattern: (["${INPUT_PATH:/input}/**.json","${INPUT_PATH_STAT:/input}/**.json"]) | /input |
| PROJECT_NAME | Project name | undefined |
| ENVIRONMENT_NAME | Environment name; if not provided, the value is extracted from the file name ({test_name}-{environment-name}-{execution_id}) | undefined |
| TEST_NAME | Test name; if not provided, the value is extracted from the file name ({test_name}-{environment-name}-{execution_id} or {test_name}-{execution_id} or {test_name}) | undefined |
| TEST_METADATA | Test metadata as key/value pairs, ex: (version=v1,type=daily,region=europe). | |
| TEST_TAGS | Test tags; adds the testtags field as a list of values, ex: (v1,daily,europe). | |
| EXECUTION_ID | Execution id; if not provided, the value is extracted from the file name ({test_name}-{environment-name}-{execution_id} or {test_name}-{execution_id}) | undefined |
| FILE_READ_MODE | File input configuration mode | tail |
| FILE_START_POSITION | File input configuration start_position | beginning |
| FILE_EXIT_AFTER_READ | File input configuration exit_after_read | false |
| FILE_COMPLETED_ACTION | File input configuration file_completed_action | log |
| MISSED_RESPONSE_CODE | Default response code when not present in the response (e.g. on timeout) | 510 |
| PARSE_LABELS_SPLIT_CHAR | Char used to split a label into labels | / |
| PARSE_TRANSACTION_REGEX | Regex used to identify a transaction label | _.+_ |
| PARSE_TRANSACTION_AUTO | Detect transaction controllers based on null URL and message format. | true |
| PARSE_FILTER_INCLUDE_SAMPLER_REGEX | Regex used to include samplers and transactions. | |
| PARSE_FILTER_EXCLUDE_SAMPLER_REGEX | Regex used to exclude samplers and transactions. | |
| PARSE_REMOVE_TRANSACTION | Remove transactions. | false |
| PARSE_REMOVE_SAMPLER | Remove samplers (not transactions). | false |
| PARSE_REMOVE_MESSAGE_FIELD | Remove the message field. | true |
| PARSE_CLEANUP_FIELDS | Remove fields: host, path. | true |
| PARSE_WITH_FLAG_SUBRESULT | Flag subresults (labels with a suffix like "-xx" where xx is the order number). | true |
| PCT1 | Percentile 1 value for the statistics report | 90 |
| PCT2 | Percentile 2 value for the statistics report | 95 |
| PCT3 | Percentile 3 value for the statistics report | 99 |
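The file-name convention referenced by ENVIRONMENT_NAME, TEST_NAME and EXECUTION_ID ({test_name}-{environment-name}-{execution_id}) can be sketched with plain shell parameter expansion (mytest-staging-42 is a hypothetical file base name; the real extraction is done by the Logstash pipeline, not by this snippet):

```shell
# Hypothetical JTL base name following {test_name}-{environment-name}-{execution_id}.
f="mytest-staging-42"
test_name="${f%%-*}"            # part before the first dash
rest="${f#*-}"
environment_name="${rest%%-*}"  # part between the dashes
execution_id="${rest#*-}"       # part after the second dash
echo "$test_name / $environment_name / $execution_id"   # prints: mytest / staging / 42
```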

3.6. Fields

3.6.1. Common

Common fields are added during file parsing for both JTL and statistics files.

| Field | Type (ELK) | Source | Description |
|---|---|---|---|
| origin | string | - | Origin of the message: jtl or stat |
| @timestamp | date | elk | Insertion time in Elasticsearch |
| environment | string | input/parsing | Target environment (ex: dev, stage, ...), taken from the environment variable or extracted from the file name. |
| executionid | string | input/parsing | Unique id identifying the data of a test run, taken from the environment variable or extracted from the file name. |
| filename | string | parsing | File name without extension. |
| path | string | logstash | Path of the file. |
| project | string | input | Project name. |
| testname | string | parsing | Test name, taken from the environment variable or extracted from the file name. |
| testtags | string | parsing | List of keywords extracted by splitting the TEST_TAGS environment variable. |
| timestamp | date | csv | Request time. Accepts a timestamp in ms or the format "yyyy/MM/dd HH:mm:ss.SSS". |
| label | string | csv | Sampler label |
| labels | string | parsing | List of keywords extracted by splitting the label with the char from the PARSE_LABELS_SPLIT_CHAR environment variable, default "/". |
| globalLabel | string | parsing | Normalized label for subresults. On redirection (ex: 302), JMeter logs all redirect requests plus the parent one by default (jmeter.save.saveservice.subresults=false to disable). The parent keeps the normal label and each subresult gets a suffix like "-xx" where xx is the order number; this field holds the original label of each subresult without the number suffix (see fields: subresult, redirectLevel). |
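The label -> labels split described in the table can be illustrated in shell (shop/cart/checkout is a made-up label; the real split is done by the Logstash filter using PARSE_LABELS_SPLIT_CHAR):

```shell
# Split a sampler label on the default split char "/".
label="shop/cart/checkout"
labels=$(echo "$label" | tr '/' ' ')
echo "$labels"   # prints: shop cart checkout
```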

3.6.2. JTL

Fields for the JTL file. For CSV fields, see the documentation on CSV Log format.

For additional fields, see the documentation on Results file configuration.

| Field | Type (ELK) | Source | Description |
|---|---|---|---|
| Connect | long | csv | Time to establish the connection |
| IdleTime | long | csv | Number of milliseconds of 'Idle' time (normally 0) |
| Latency | long | csv | Time to first response |
| URL | string | csv | |
| allThreads | long | csv | Total number of active threads in all groups |
| bytes | long | csv | Number of bytes in the sample |
| dataType | string | csv | e.g. text |
| domain | string | parsing | Domain name or IP, extracted from the URL. |
| elapsed | long | csv | Elapsed time, in milliseconds |
| failureMessage | string | csv | |
| grpThreads | long | csv | Number of active threads in this thread group |
| host | string | elk | Hostname of the Logstash node. |
| redirectLevel | long | parsing | Redirect number (see field: globalLabel) |
| relativetime | float | parsing | Number of milliseconds since test start. Useful to compare tests; requires the test start time to be logged to the CSV (add the variable name TESTSTART.MS to the property sample_variables). |
| request | string | parsing | Request path for HTTP/S requests. |
| responseCode | string | csv | |
| responseMessage | string | csv | |
| responseStatus | long | parsing | Numeric responseCode; when responseCode is not numeric (e.g. on timeout), the value of the MISSED_RESPONSE_CODE environment variable is used, default 510. |
| sentBytes | long | csv | Number of bytes sent for the sample. |
| subresult | boolean | parsing | true if the sample is a subresult (see field: globalLabel) |
| success | boolean | csv | true or false. |
| teststart | date | csv | Test start time. Requires the test start time to be logged to the CSV (add the variable name TESTSTART.MS to the property sample_variables). |
| threadGrpId | long | parsing | The number of the thread group (extracted from threadName). |
| threadGrpName | string | parsing | The name of the thread group (extracted from threadName). |
| workerNode | string | parsing | host:port of the worker node (extracted from threadName). |
| threadNumber | long | parsing | The number of the thread, unique within the thread group (extracted from threadName). |
| threadName | string | csv | Thread name, unique in the test. |
| transaction | boolean | parsing | Whether the sampler is a transaction generated by a TransactionController; to be identified as a transaction, the label should start and end with "_" (the regex used is "_.+_"). |
| transactionFailingSampler | long | parsing | If the sample is a transaction, the number of failing samplers. |
| transactionTotalSampler | long | parsing | If the sample is a transaction, the total count of samplers. |
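The transaction flag in the table relies on the label starting and ending with "_" (default regex "_.+_"). A rough shell equivalent of that check (the anchored grep pattern and the label names are illustrations, not the pipeline's actual code):

```shell
# Prints "transaction" when the label starts and ends with "_", else "sampler".
kind() { echo "$1" | grep -Eq '^_.+_$' && echo transaction || echo sampler; }
kind "_checkout_"   # prints: transaction
kind "login"        # prints: sampler
```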

3.6.3. Statistics (Only ElasticSearch)

Fields for the statistics file.

For percentiles configuration (pct1, pct2, pct3), see Percentiles configuration.

| Field | Type (ELK) | Source | Description |
|---|---|---|---|
| isTotal | boolean | parsing | Whether this is the total line; its label is "Total" |
| sampleCount | long | parsing | Sample count |
| errorPct | long | parsing | Percentage of errors |
| errorCount | long | parsing | Error count |
| receivedKBytesPerSec | long | parsing | Received kilobytes per second |
| sentKBytesPerSec | long | parsing | Sent kilobytes per second |
| throughput | long | parsing | Throughput |
| pct1ResTime | long | parsing | Percentile 1 response time (aggregate_rpt_pct1) |
| pct90 | long | parsing | Percentile 1 value; the field name can be changed with PCT1, see configuration |
| pct2ResTime | long | parsing | Percentile 2 response time (aggregate_rpt_pct2) |
| pct95 | long | parsing | Percentile 2 value; the field name can be changed with PCT2, see configuration |
| pct3ResTime | long | parsing | Percentile 3 response time (aggregate_rpt_pct3) |
| pct99 | long | parsing | Percentile 3 value; the field name can be changed with PCT3, see configuration |
| minResTime | long | parsing | Minimum response time. |
| meanResTime | long | parsing | Mean response time. |
| medianResTime | long | parsing | Median response time. |
| maxResTime | long | parsing | Maximum response time. |

4. Troubleshooting & Limitation

  1. A Logstash instance can't parse CSV files with different header formats, as the first header seen is used for all files. If you have files with different formats, use a new instance each time or restart the instance.
  2. Changing the sincedb file can't be done on Logstash with Elasticsearch without building the image.
  3. Labels with a '-{number}' suffix are considered subresults, so don't suffix labels with '-{number}', or disable the subresult flag with PARSE_WITH_FLAG_SUBRESULT.
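For limitation 3, the subresult flag can be turned off with the variable from the parameters table (a sketch, assuming the Elasticsearch setup from the Getting Started section):

```shell
# Disable subresult flagging for labels that legitimately end in -{number}.
docker run --rm -it --net jmeter \
  -e "ELASTICSEARCH_HOSTS=http://jmeter-elastic:9200" \
  -e "PARSE_WITH_FLAG_SUBRESULT=false" \
  -v ${PWD}/input:/input/ \
  anasoid/jmeter-logstash
```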
