
I am trying to figure out the use of the strings.Join method compared to regular concatenation with +=.

For this comparison, I am using both methods on os.Args as the list of strings to concatenate.

My code for the two concatenation functions is:

func regular_concatenation() {
    var s, sep string
    for i := 1; i < len(os.Args); i++ {
        s += sep + os.Args[i]
        sep = " "
    }
    fmt.Println(s)
}

func join_concatenation() {
    fmt.Println(strings.Join(os.Args, " "))
}

And the main function for the performance check is:

func main() {
    var end_1, end_2 float64
    var start_1, start_2 time.Time

    start_1 = time.Now()
    for i := 0; i < 100; i++ {
        join_concatenation()
    }
    end_1 = time.Since(start_1).Seconds()

    start_2 = time.Now()
    for i := 0; i < 100; i++ {
        regular_concatenation()
    }
    end_2 = time.Since(start_2).Seconds()

    fmt.Println(end_1)
    fmt.Println(end_2)
}

The problem is that when I run the code, say with 20 arguments in os.Args, I get the result that the strings.Join method is slower than the regular concatenation.

This is confusing to me, because as I understood it, the += method allocates a new string each time (strings are immutable in Go), so the garbage collector has to run to collect the discarded intermediate strings, and this wastes time.
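
As a side note, the per-iteration allocations I am describing can be counted directly with testing.AllocsPerRun. Here is a minimal sketch of that idea (the hard-coded args slice is only a stand-in for os.Args, chosen for illustration):

package main

import (
    "fmt"
    "strings"
    "testing"
)

func main() {
    // Hard-coded stand-in for os.Args, just for illustration.
    args := []string{"one", "two", "three", "four", "five"}

    // Average number of heap allocations per call of the += loop.
    concatAllocs := testing.AllocsPerRun(1000, func() {
        var s, sep string
        for i := 0; i < len(args); i++ {
            s += sep + args[i]
            sep = " "
        }
        _ = s
    })

    // Average number of heap allocations per call of strings.Join.
    joinAllocs := testing.AllocsPerRun(1000, func() {
        _ = strings.Join(args, " ")
    })

    fmt.Println("+= allocations per call:  ", concatAllocs)
    fmt.Println("Join allocations per call:", joinAllocs)
}

If my understanding is right, the += loop should report an allocation for every loop iteration, while strings.Join should report a single allocation per call.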

So the question is: is strings.Join really faster? And if it is, what am I doing wrong in this example?

asked Mar 14, 2020 at 23:25 by Roy Levy
5 Comments
  • Start by creating real, independent benchmarks; poor benchmarks give poor results. The join method will have fewer allocations, and will be faster provided there aren't other confounding variables. Your join function also handles one more argument than the other. (Commented Mar 15, 2020 at 0:47)
  • You have to google how to benchmark Go code; there's nothing to discuss here, I think. (Commented Mar 15, 2020 at 1:06)
  • Well, I would suggest you ask about what's wrong with this benchmark, because your simple question has a simple answer: strings.Join is faster than string concatenation. (Commented Mar 15, 2020 at 5:31)
  • One difference may be that you are adding os.Args[0] in the 2nd benchmark. (Commented Mar 15, 2020 at 6:51)
  • Try using an array of 1000 or so elements instead of os.Args and you will see that strings.Join is faster every time. (Commented Mar 2, 2021 at 11:30)

1 Answer


Due to various compiler optimizations, string concatenation can be quite efficient, but in your case I found that strings.Join is faster (see benchmarks of your code below).

In general, for building up a string it is recommended to use strings.Builder. See How to efficiently concatenate strings in Go.

BTW, you should be using the brilliant benchmarking facility that comes with Go. Just put these functions in a file ending with _test.go (e.g. string_test.go) and run go test -bench=.

func BenchmarkConcat(b *testing.B) { // 132 ns/op
    ss := []string{"sadsadsa", "dsadsakdas;k", "8930984"}
    for i := 0; i < b.N; i++ {
        var s, sep string
        for j := 0; j < len(ss); j++ {
            s += sep + ss[j]
            sep = " "
        }
        _ = s
    }
}

func BenchmarkJoin(b *testing.B) { // 56.7 ns/op
    ss := []string{"sadsadsa", "dsadsakdas;k", "8930984"}
    for i := 0; i < b.N; i++ {
        s := strings.Join(ss, " ")
        _ = s
    }
}

func BenchmarkBuilder(b *testing.B) { // 58.5 ns/op
    ss := []string{"sadsadsa", "dsadsakdas;k", "8930984"}
    for i := 0; i < b.N; i++ {
        var s strings.Builder
        // Grow builder to expected max length (maybe this
        // needs to be calculated dep. on your requirements)
        s.Grow(32)
        var sep string
        for j := 0; j < len(ss); j++ {
            s.WriteString(sep)
            s.WriteString(ss[j])
            sep = " "
        }
        _ = s.String()
    }
}
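
To see the scaling effect mentioned in the comments above, you could add something like the following sketch to the same _test.go file; the 1000-element size and the buildInput helper are arbitrary choices for illustration, not something from the question. Running go test -bench=. -benchmem additionally reports allocations per operation, which makes the difference clear: the += version allocates on every loop iteration, while strings.Join allocates once per call.

// buildInput returns a slice of n identical strings to concatenate.
func buildInput(n int) []string {
    ss := make([]string, n)
    for i := range ss {
        ss[i] = "argument"
    }
    return ss
}

func BenchmarkConcatLarge(b *testing.B) {
    ss := buildInput(1000)
    b.ResetTimer() // exclude the setup above from the measurement
    for i := 0; i < b.N; i++ {
        var s, sep string
        for j := 0; j < len(ss); j++ {
            s += sep + ss[j]
            sep = " "
        }
        _ = s
    }
}

func BenchmarkJoinLarge(b *testing.B) {
    ss := buildInput(1000)
    b.ResetTimer()
    for i := 0; i < b.N; i++ {
        _ = strings.Join(ss, " ")
    }
}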
answered Mar 15, 2020 at 5:55 by Andrew W. Phillips

7 Comments

There's nothing to benchmark. Concatenation explicitly makes an allocation for each element, while strings.Join uses a preallocated buffer. Of course it might be faster in the case of a 1-2 element concatenation, because there is less work.
There's no need to use a builder, because that's what strings.Join does internally. When the size is known, I'd use make([]byte, size) and copy if it needs to be fast (a sketch of that approach follows after these comments).
If there is nothing to benchmark, isn't it weird that the code I used with the "time" measurement does not show the expected result you are describing (that strings.Join is faster either way)?
@Laevus Dexter sure, creating your own buffer and then using copy could perhaps be a little faster, but you should provide the code so we can benchmark it. Personally I doubt it is much faster than strings.Join, and it's always better to use a known-good standard library function. BTW, I don't understand what you mean by "there's nothing to benchmark", unless you mean that it's so obvious which is faster that there is no point.
Sure, I upvoted this answer because it is really helpful information, but I can't see the answer to his actual question: if Join is supposed to be faster, why is it not faster in his case (without a proper benchmark)?
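For reference, here is a minimal sketch of the preallocated-buffer approach mentioned in the comments above. The concatCopy name and the empty-slice check are additions for illustration, and since this is roughly what strings.Join already does internally (except for the extra copy made by the final string conversion), a dramatic speedup should not be expected.

// concatCopy joins ss with sep using a single preallocated []byte and copy.
func concatCopy(ss []string, sep string) string {
    if len(ss) == 0 {
        return ""
    }

    // Work out the exact output length up front.
    n := len(sep) * (len(ss) - 1)
    for _, s := range ss {
        n += len(s)
    }

    // Fill one preallocated buffer with copy.
    buf := make([]byte, n)
    pos := copy(buf, ss[0])
    for _, s := range ss[1:] {
        pos += copy(buf[pos:], sep)
        pos += copy(buf[pos:], s)
    }

    // The conversion back to string makes one final copy.
    return string(buf)
}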