  • ELK Stack Basics - ③ Logstash + Working with Real Data
    Database 2022. 10. 31. 16:57

    Continuing from the previous post, this time I've written up how to use Logstash in the ELK stack.

    All of the material, screenshots, and code were taken from the following lecture! 🙌

     

    Reference lecture (playlist)

    https://www.youtube.com/watch?v=3iA-ncqAqYE&list=PLVNY1HnUlO24LCsgOxR_eK2Yi4sOgH9Pg&index=19 

     

     

    GitHub: lecture reference material

    https://github.com/minsuk-heo/BigData

     


    Previous post (Kibana)

    https://dodop-blog.tistory.com/411

     


    Logstash

    Logstash can ingest data from many sources, transform it, and send it on to Elasticsearch.

    Logstash is a data processing pipeline: it collects logs of many types and ships them to a destination such as Elasticsearch, where they are indexed.

    The collected data can be reshaped into whatever format you want. For example, if a CSV holds numbers as text and you want to add or subtract them, you can convert them to numeric values and do the calculation.
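    For example, the mutate filter's convert option (the same technique used in the configurations later in this post) turns a text field into a number so it can be used in calculations. A minimal sketch:

    filter {
      # "1980" arrives as text from a CSV column; convert it to a float
      mutate {convert => ["1980", "float"]}
    }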

    Installation

    First, let's install Logstash. Note that Java must be installed beforehand.

    $ brew install elastic/tap/logstash-full

    Once installation completes, you can start Logstash with the following command.

    $ brew services start elastic/tap/logstash-full
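    Once it is running, Logstash serves a small monitoring API on port 9600 (you can see it start in the logs further down), which makes for a quick health check:

    % curl -XGET 'http://localhost:9600/?pretty'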

     

     

    Configuring Logstash

    Logstash's input and output are set up in a configuration file like the one below.

    As a simple start, let's take input from the keyboard and send the output to standard out (the monitor).

    A Logstash configuration consists of three sections: input, filter, and output.

    % vi logstash-simple.conf
    input { 
    	stdin { } 
    }
    output {
    	stdout { }
    }
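    For reference, the pretty-printed events shown further down come from the rubydebug codec, which is the stdout plugin's default; writing it out explicitly is equivalent to the output above:

    output {
    	stdout { codec => rubydebug }
    }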

    The following commands run the program with the configuration file.

    ## cd into the bin directory of your Logstash install (the path differs by environment)
    % cd /opt/homebrew/Cellar/logstash-full/7.17.4/bin
    
    ## run logstash, passing the path of the logstash-simple.conf file created above
    % sudo ./logstash -f [file path]/logstash-simple.conf
    Password:
    Using bundled JDK: /opt/homebrew/Cellar/logstash-full/7.17.4/libexec/jdk.app/Contents/Home
    OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
    Sending Logstash logs to /opt/homebrew/Cellar/logstash-full/7.17.4/libexec/logs which is now configured via log4j2.properties
    [2022-10-28T14:48:29,554][INFO ][logstash.runner          ] Log4j configuration path used is: /opt/homebrew/Cellar/logstash-full/7.17.4/libexec/config/log4j2.properties
    [2022-10-28T14:48:29,566][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.17.4", "jruby.version"=>"jruby 9.2.20.1 (2.5.8) 2021-11-30 2a2962fbd1 OpenJDK 64-Bit Server VM 11.0.14.1+1 on 11.0.14.1+1 +indy +jit [darwin-x86_64]"}
    [2022-10-28T14:48:29,570][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djdk.io.File.enableADS=true, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -Djruby.regexp.interruptible=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true]
    [2022-10-28T14:48:29,599][INFO ][logstash.settings        ] Creating directory {:setting=>"path.queue", :path=>"/opt/homebrew/Cellar/logstash-full/7.17.4/libexec/data/queue"}
    [2022-10-28T14:48:29,609][INFO ][logstash.settings        ] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/opt/homebrew/Cellar/logstash-full/7.17.4/libexec/data/dead_letter_queue"}
    [2022-10-28T14:48:29,664][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
    [2022-10-28T14:48:29,688][INFO ][logstash.agent           ] No persistent UUID file found. Generating new UUID {:uuid=>"2c0f6ffb-c211-4964-b15c-02a30a999f46", :path=>"/opt/homebrew/Cellar/logstash-full/7.17.4/libexec/data/uuid"}
    [2022-10-28T14:48:31,163][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
    [2022-10-28T14:48:32,598][INFO ][org.reflections.Reflections] Reflections took 76 ms to scan 1 urls, producing 119 keys and 419 values 
    [2022-10-28T14:48:33,994][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/Users/yunhalee/logstash-simple.conf"], :thread=>"#<Thread:0x5e3e882a run>"}
    [2022-10-28T14:48:35,074][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.05}
    WARNING: An illegal reflective access operation has occurred
    WARNING: Illegal reflective access by com.jrubystdinchannel.StdinChannelLibrary$Reader (file:/opt/homebrew/Cellar/logstash-full/7.17.4/libexec/vendor/bundle/jruby/2.5.0/gems/jruby-stdin-channel-0.2.0-java/lib/jruby_stdin_channel/jruby_stdin_channel.jar) to field java.io.FilterInputStream.in
    WARNING: Please consider reporting this to the maintainers of com.jrubystdinchannel.StdinChannelLibrary$Reader
    WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
    WARNING: All illegal access operations will be denied in a future release
    [2022-10-28T14:48:35,464][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
    The stdin plugin is now waiting for input:
    [2022-10-28T14:48:35,525][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

    Whatever you type is echoed back as a structured event:

    hello
    {
           "message" => "hello",
        "@timestamp" => 2022-10-28T05:49:22.583Z,
          "@version" => "1",
              "host" => "[your hostname]"
    }
    next step
    {
           "message" => "next step",
        "@timestamp" => 2022-10-28T05:49:31.002Z,
          "@version" => "1",
              "host" => "[your hostname]"
    }
    ^C[2022-10-28T14:49:46,940][WARN ][logstash.runner          ] SIGINT received. Shutting down.
    [2022-10-28T14:49:47,219][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
    [2022-10-28T14:49:48,154][INFO ][logstash.pipelinesregistry] Removed pipeline from registry successfully {:pipeline_id=>:main}
    [2022-10-28T14:49:48,208][INFO ][logstash.runner          ] Logstash shut down.
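    The filter section was left out here, so events pass through unchanged. As a minimal sketch of where a filter would go (my example, not from the lecture), a mutate filter could uppercase each message before it is printed:

    input { 
    	stdin { } 
    }
    filter {
    	# transform the "message" field of every event to uppercase
    	mutate { uppercase => ["message"] }
    }
    output {
    	stdout { }
    }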


    Analyzing Real Data ①

    Now let's use the ELK stack to analyze real population data.

     

    You can download sample datasets from the following site!

    (I couldn't find the exact file there, so I just used the copy uploaded to GitHub.)

    https://catalog.data.gov/dataset

     


    First, start Logstash.

    The configuration file to apply to this data is as follows.

    % vi logstash.conf
    input {
      file {
        path => "/[실 데이터 파일 경로]/populationbycountry19802010millions.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
      }
    }
    filter {
      csv {
          separator => ","
          columns => ["Country","1980","1981","1982","1983","1984","1985","1986","1987","1988","1989","1990","1991","1992","1993","1994","1995","1996","1997","1998","1999","2000","2001","2002","2003","2004","2005","2006","2007","2008","2009","2010"]
      }
      mutate {convert => ["1980", "float"]}
      mutate {convert => ["1981", "float"]}
      mutate {convert => ["1982", "float"]}
      mutate {convert => ["1983", "float"]}
      mutate {convert => ["1984", "float"]}
      mutate {convert => ["1985", "float"]}
      mutate {convert => ["1986", "float"]}
      mutate {convert => ["1987", "float"]}
      mutate {convert => ["1988", "float"]}
      mutate {convert => ["1989", "float"]}
      mutate {convert => ["1990", "float"]}
      mutate {convert => ["1991", "float"]}
      mutate {convert => ["1992", "float"]}
      mutate {convert => ["1993", "float"]}
      mutate {convert => ["1994", "float"]}
      mutate {convert => ["1995", "float"]}
      mutate {convert => ["1996", "float"]}
      mutate {convert => ["1997", "float"]}
      mutate {convert => ["1998", "float"]}
      mutate {convert => ["1999", "float"]}
      mutate {convert => ["2000", "float"]}
      mutate {convert => ["2001", "float"]}
        mutate {convert => ["2002", "float"]}
      mutate {convert => ["2003", "float"]}
      mutate {convert => ["2004", "float"]}
      mutate {convert => ["2005", "float"]}
      mutate {convert => ["2006", "float"]}
      mutate {convert => ["2007", "float"]}
      mutate {convert => ["2008", "float"]}
      mutate {convert => ["2009", "float"]}
      mutate {convert => ["2010", "float"]}
    }
    output {
        elasticsearch {
            hosts => "localhost"
            index => "population"
        }
        stdout {}
    }
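    As an aside, the 31 mutate blocks above follow the lecture as-is; the convert option also accepts a hash, so the same conversions could be collapsed into a single mutate (an abbreviated sketch):

    mutate {
      convert => {
        "1980" => "float"
        "1981" => "float"
        # ...one entry per year, through "2010"
      }
    }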

    The sample CSV file can be found at the link below and fetched with curl.

    https://github.com/minsuk-heo/BigData/blob/master/ch06/populationbycountry19802010millions.csv

     


    ## use the raw file URL; the github.com/.../blob/... address returns an HTML page, not the CSV itself
    % curl -O -L https://raw.githubusercontent.com/minsuk-heo/BigData/master/ch06/populationbycountry19802010millions.csv
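    A quick look at the first few lines confirms the download really is CSV, and shows the column layout that the csv filter's columns list above has to match:

    % head -3 populationbycountry19802010millions.csv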

    Now let's run Logstash with this configuration file.

    % sudo ./logstash -f /[config file path]/logstash.conf
    
    [2022-10-28T15:11:05,475][INFO ][filewatch.observingtail  ][main][b8c84bbf47c7e4225b3a499d1e1ca38e7dddd4ab649da6c8760c1f506d48a467] START, creating Discoverer, Watch with file and sincedb collections
    {
              "2009" => 474.53897,
              "1997" => 403.41352,
              "1998" => 409.62879,
              "1995" => 390.75665,
              "2003" => 438.97976,
              "1987" => 338.59859,
              "2008" => 468.73872,
              "1996" => 397.13002,
              "1990" => 358.79973,
           "Country" => "Central & South America",
              "2007" => 462.89157,
              "2001" => 427.24012,
              "2002" => 433.05116,
              "1992" => 371.43224,
              "2004" => 445.01525,
              "2006" => 457.01699,
              "1994" => 384.26984,
              "1980" => 293.05856,
              "1999" => 415.63607,
              "host" => "[your hostname]",
              "1984" => 318.87955,
              "2010" => 480.01228,
          "@version" => "1",
              "1985" => 325.22704,
              "1988" => 345.44544,
              "path" => "/[path to the data file]/populationbycountry19802010millions.csv",
              "1986" => 331.82291,
              "1981" => 299.43033,
              "1989" => 352.20471,
           "message" => "Central & South America,293.05856,299.43033,305.95253,312.51136,318.87955,325.22704,331.82291,338.59859,345.44544,352.20471,358.79973,365.15137,371.43224,377.7438,384.26984,390.75665,397.13002,403.41352,409.62879,415.63607,421.4539,427.24012,433.05116,438.97976,445.01525,451.05504,457.01699,462.89157,468.73872,474.53897,480.01228",
              "2000" => 421.4539,
              "1991" => 365.15137,
        "@timestamp" => 2022-10-28T06:11:05.777Z,
              "1983" => 312.51136,
              "1993" => 377.7438,
              "2005" => 451.05504,
              "1982" => 305.95253
    }
    //....
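    Before moving to Kibana, you can confirm that the documents actually landed in Elasticsearch with a count query on the population index:

    % curl -XGET 'http://localhost:9200/population/_count?pretty'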

    Now let's visualize the ingested data in Kibana. First, create the index pattern.

    Use Discover to look through the ingested documents.

    Try a Korea filter to pull up the matching data.

    Here, add 1980, 2010, and Country as fields to compare the population counts for those years.

    Next, use a data table to view the 1980 population by country.

    And finally, view it as a pie chart.
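    For reference, the Korea lookup with only the Country, 1980, and 2010 fields can also be reproduced against Elasticsearch directly. A sketch using the standard search API:

    % curl -XGET 'http://localhost:9200/population/_search?pretty' -H 'Content-Type: application/json' -d'
    {
      "query": { "match": { "Country": "Korea" } },
      "_source": ["Country", "1980", "2010"]
    }'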


    Analyzing Real Data ②

    This time, let's analyze real stock data.

    First, go to Yahoo Finance and download some stock data. Like the lecture, I downloaded the Meta (formerly Facebook) data.

    Stock page -> Historical Data -> set the period to 5 years -> Apply -> Download

    https://finance.yahoo.com/quote/META/history?period1=1509408000&period2=1667088000&interval=1d&filter=history&frequency=1d&includeAdjustedClose=true

     


    A few notes on the settings used below:

    path -> only absolute file paths are supported.

    start_position -> defaults to "end" (for live logs you normally tail from the end). Here the CSV already contains all of the data, so we change the position to "beginning".

    sincedb_path -> Logstash records how far it has read each file; without sincedb_path => "/dev/null" it remembers that the file was already read, so a second run of this exercise can misbehave. /dev/null makes it forget and re-read from the start.

    The csv filter splits each line on "," and assigns the values to the listed columns.

    % vi logstash_stock.conf
    input {
      file {
        path => "[실데이터 csv파일 경로]"
        start_position => "beginning"
        sincedb_path => "/dev/null"    
      }
    }
    filter {
      csv {
          separator => ","
          columns => ["Date","Open","High","Low","Close","Volume","Adj Close"]
      }
      mutate {convert => ["Open", "float"]}
      mutate {convert => ["High", "float"]}
      mutate {convert => ["Low", "float"]}
      mutate {convert => ["Close", "float"]}
    }
    output {  
        elasticsearch {
            hosts => "localhost"
            index => "stock"
        }
        stdout {}
    }
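    One optional improvement (my addition, not from the lecture): a date filter can parse the Date column so that each event's @timestamp becomes the actual trading day instead of the ingest time. A minimal sketch:

    filter {
      # parse "2017-11-06"-style strings into @timestamp
      date {
        match => ["Date", "yyyy-MM-dd"]
      }
    }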

     

    Running it with the configuration file, we can see the data going in as follows.

    % cd /opt/homebrew/Cellar/logstash-full/7.17.4/bin
    % sudo ./logstash -f /[config file path]/logstash_stock.conf
    Password:
    Using bundled JDK: /opt/homebrew/Cellar/logstash-full/7.17.4/libexec/jdk.app/Contents/Home
    OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
    Sending Logstash logs to /opt/homebrew/Cellar/logstash-full/7.17.4/libexec/logs which is now configured via log4j2.properties
    [2022-10-31T12:14:26,917][INFO ][logstash.runner          ] Log4j configuration path used is: /opt/homebrew/Cellar/logstash-full/7.17.4/libexec/config/log4j2.properties
    [2022-10-31T12:14:26,950][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.17.4", "jruby.version"=>"jruby 9.2.20.1 (2.5.8) 2021-11-30 2a2962fbd1 OpenJDK 64-Bit Server VM 11.0.14.1+1 on 11.0.14.1+1 +indy +jit [darwin-x86_64]"}
    [2022-10-31T12:14:26,957][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djdk.io.File.enableADS=true, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -Djruby.regexp.interruptible=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true]
    [2022-10-31T12:14:27,060][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
    [2022-10-31T12:14:28,689][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
    [2022-10-31T12:14:30,530][INFO ][org.reflections.Reflections] Reflections took 77 ms to scan 1 urls, producing 119 keys and 419 values 
    [2022-10-31T12:14:32,888][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
    [2022-10-31T12:14:33,340][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
    [2022-10-31T12:14:33,715][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
    [2022-10-31T12:14:33,850][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (7.17.4) {:es_version=>7}
    [2022-10-31T12:14:33,866][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
    [2022-10-31T12:14:34,041][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. `data_stream => auto` resolved to `false`
    [2022-10-31T12:14:34,039][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. `data_stream => auto` resolved to `false`
    [2022-10-31T12:14:34,201][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
    [2022-10-31T12:14:34,283][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/Users/yunhalee/logstash_stock.conf"], :thread=>"#<Thread:0x306a49ae run>"}
    [2022-10-31T12:14:36,374][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>2.08}
    [2022-10-31T12:14:36,569][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
    [2022-10-31T12:14:36,647][INFO ][filewatch.observingtail  ][main][ab27d6a5298ac597649c7f0c065c03d0b9ccdffef192abf4f521ff0323d86143] START, creating Discoverer, Watch with file and sincedb collections
    [2022-10-31T12:14:36,697][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
    [2022-10-31T12:14:41,609][WARN ][logstash.outputs.elasticsearch][main][6ce34df4648a4f9fee47d0a721bff397fea10069e9cd9b2a8aa660b741b9f80f] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"stock", :routing=>nil}, {"path"=>"/Users/yunhalee/Downloads/META.csv", "Adj Close"=>"13312700", "Open"=>178.559998, "High"=>180.449997, "Date"=>"2017-11-06", "host"=>"oliui-MacBookPro.local", "Volume"=>"180.169998", "message"=>"2017-11-06,178.559998,180.449997,178.309998,180.169998,180.169998,13312700", "Low"=>178.309998, "@version"=>"1", "@timestamp"=>2022-10-31T03:14:36.994Z, "Close"=>180.169998}], :response=>{"index"=>{"_index"=>"stock", "_type"=>"_doc", "_id"=>"AM8ILIQBfJKDlTaJvFIn", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [Date] cannot be changed from type [text] to [date]"}}}}
    [2022-10-31T12:14:41,575][WARN ][logstash.outputs.elasticsearch][main][6ce34df4648a4f9fee47d0a721bff397fea10069e9cd9b2a8aa660b741b9f80f] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"stock", :routing=>nil}, {"path"=>"/Users/yunhalee/Downloads/META.csv", "Adj Close"=>"35529900", "Open"=>180.630005, "High"=>181.940002, "Date"=>"2017-11-02", "host"=>"oliui-MacBookPro.local", "Volume"=>"178.919998", "message"=>"2017-11-02,180.630005,181.940002,177.339996,178.919998,178.919998,35529900", "Low"=>177.339996, "@version"=>"1", "@timestamp"=>2022-10-31T03:14:36.994Z, "Close"=>178.919998}], :response=>{"index"=>{"_index"=>"stock", "_type"=>"_doc", "_id"=>"b88ILIQBfJKDlTaJvFEm", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [Date] cannot be changed from type [text] to [date]"}}}}
    [2022-10-31T12:14:41,575][WARN ][logstash.outputs.elasticsearch][main][6ce34df4648a4f9fee47d0a721bff397fea10069e9cd9b2a8aa660b741b9f80f] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"stock", :routing=>nil}, {"path"=>"/Users/yunhalee/Downloads/META.csv", "Adj Close"=>"12928200", "Open"=>180.5, "High"=>180.75, "Date"=>"2017-11-07", "host"=>"oliui-MacBookPro.local", "Volume"=>"180.250000", "message"=>"2017-11-07,180.500000,180.750000,178.960007,180.250000,180.250000,12928200", "Low"=>178.960007, "@version"=>"1", "@timestamp"=>2022-10-31T03:14:36.995Z, "Close"=>180.25}], :response=>{"index"=>{"_index"=>"stock", "_type"=>"_doc", "_id"=>"fs8ILIQBfJKDlTaJvFAj", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [Date] cannot be changed from type [text] to [date]"}}}}
    [2022-10-31T12:14:41,609][WARN ][logstash.outputs.elasticsearch][main][6ce34df4648a4f9fee47d0a721bff397fea10069e9cd9b2a8aa660b741b9f80f] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"stock", :routing=>nil}, {"path"=>"/Users/yunhalee/Downloads/META.csv", "Adj Close"=>"40918300", "Open"=>182.360001, "High"=>182.899994, "Date"=>"2017-11-01", "host"=>"oliui-MacBookPro.local", "Volume"=>"182.660004", "message"=>"2017-11-01,182.360001,182.899994,180.570007,182.660004,182.660004,40918300", "Low"=>180.570007, "@version"=>"1", "@timestamp"=>2022-10-31T03:14:36.994Z, "Close"=>182.660004}], :response=>{"index"=>{"_index"=>"stock", "_type"=>"_doc", "_id"=>"E88ILIQBfJKDlTaJvFEm", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [Date] cannot be changed from type [text] to [date]"}}}}
    [2022-10-31T12:14:41,617][WARN ][logstash.outputs.elasticsearch][main][6ce34df4648a4f9fee47d0a721bff397fea10069e9cd9b2a8aa660b741b9f80f] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"stock", :routing=>nil}, {"path"=>"/Users/yunhalee/Downloads/META.csv", "Adj Close"=>"17822100", "Open"=>179.289993, "High"=>179.860001, "Date"=>"2017-11-03", "host"=>"oliui-MacBookPro.local", "Volume"=>"178.919998", "message"=>"2017-11-03,179.289993,179.860001,176.710007,178.919998,178.919998,17822100", "Low"=>176.710007, "@version"=>"1", "@timestamp"=>2022-10-31T03:14:36.994Z, "Close"=>178.919998}], :response=>{"index"=>{"_index"=>"stock", "_type"=>"_doc", "_id"=>"gM8ILIQBfJKDlTaJvFAk", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [Date] cannot be changed from type [text] to [date]"}}}}
    [2022-10-31T12:14:41,621][WARN ][logstash.outputs.elasticsearch][main][6ce34df4648a4f9fee47d0a721bff397fea10069e9cd9b2a8aa660b741b9f80f] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"stock", :routing=>nil}, {"path"=>"/Users/yunhalee/Downloads/META.csv", "Adj Close"=>"10494100", "Open"=>179.789993, "High"=>180.350006, "Date"=>"2017-11-08", "host"=>"oliui-MacBookPro.local", "Volume"=>"179.559998", "message"=>"2017-11-08,179.789993,180.350006,179.110001,179.559998,179.559998,10494100", "Low"=>179.110001, "@version"=>"1", "@timestamp"=>2022-10-31T03:14:36.995Z, "Close"=>179.559998}], :response=>{"index"=>{"_index"=>"stock", "_type"=>"_doc", "_id"=>"f88ILIQBfJKDlTaJvFAj", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [Date] cannot be changed from type [text] to [date]"}}}}
    [2022-10-31T12:14:41,616][WARN ][logstash.outputs.elasticsearch][main][6ce34df4648a4f9fee47d0a721bff397fea10069e9cd9b2a8aa660b741b9f80f] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"stock", :routing=>nil}, {"path"=>"/Users/yunhalee/Downloads/META.csv", "Adj Close"=>"20174200", "Open"=>180.570007, "High"=>180.800003, "Date"=>"2017-10-31", "host"=>"oliui-MacBookPro.local", "Volume"=>"180.059998", "message"=>"2017-10-31,180.570007,180.800003,178.940002,180.059998,180.059998,20174200", "Low"=>178.940002, "@version"=>"1", "@timestamp"=>2022-10-31T03:14:36.993Z, "Close"=>180.059998}], :response=>{"index"=>{"_index"=>"stock", "_type"=>"_doc", "_id"=>"_88ILIQBfJKDlTaJvFEn", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [Date] cannot be changed from type [text] to [date]"}}}}
    [2022-10-31T12:14:41,684][WARN ][logstash.outputs.elasticsearch][main][6ce34df4648a4f9fee47d0a721bff397fea10069e9cd9b2a8aa660b741b9f80f] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"stock", :routing=>nil}, {"path"=>"/Users/yunhalee/Downloads/META.csv", "Adj Close"=>"13018000", "Open"=>179.300003, "High"=>179.979996, "Date"=>"2017-11-17", "host"=>"oliui-MacBookPro.local", "Volume"=>"179.000000", "message"=>"2017-11-17,179.300003,179.979996,178.899994,179.000000,179.000000,13018000", "Low"=>178.899994, "@version"=>"1", "@timestamp"=>2022-10-31T03:14:36.998Z, "Close"=>179.0}], :response=>{"index"=>{"_index"=>"stock", "_type"=>"_doc", "_id"=>"O88ILIQBfJKDlTaJvFEm", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [Date] cannot be changed from type [text] to [date]"}}}}
    [2022-10-31T12:14:41,684][WARN ][logstash.outputs.elasticsearch][main][6ce34df4648a4f9fee47d0a721bff397fea10069e9cd9b2a8aa660b741b9f80f] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"stock", :routing=>nil}, {"path"=>"/Users/yunhalee/Downloads/META.csv", "Adj Close"=>"15607600", "Open"=>178.759995, "High"=>179.830002, "Date"=>"2017-11-16", "host"=>"oliui-MacBookPro.local", "Volume"=>"179.589996", "message"=>"2017-11-16,178.759995,179.830002,178.500000,179.589996,179.589996,15607600", "Low"=>178.5, "@version"=>"1", "@timestamp"=>2022-10-31T03:14:36.997Z, "Close"=>179.589996}], :response=>{"index"=>{"_index"=>"stock", "_type"=>"_doc", "_id"=>"G88ILIQBfJKDlTaJvFIn", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [Date] cannot be changed from type [text] to [date]"}}}}
    {
              "path" => "/[csv file path]",
         "Adj Close" => "Volume",
              "Open" => 0.0,
              "High" => 0.0,
              "Date" => "Date",
              "host" => "[your hostname]",
            "Volume" => "Adj Close",
           "message" => "Date,Open,High,Low,Close,Adj Close,Volume",
               "Low" => 0.0,
          "@version" => "1",
        "@timestamp" => 2022-10-31T03:14:36.951Z,
             "Close" => 0.0
    }
    { //...
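    Two things stand out in this run. First, in the event above, "Adj Close" and "Volume" hold each other's values: the lecture's original column list had those two in the wrong order, which the corrected columns line earlier fixes. Second, the "Could not index event" warnings appear because the CSV header row is ingested as an event too, so the literal string "Date" and real values like "2017-11-06" end up competing over whether the Date field should be mapped as text or as date, and the documents on the losing side are rejected with "mapper [Date] cannot be changed from type [text] to [date]". A simple way to avoid this (a sketch of mine, not from the lecture) is to drop the header row in the filter:

    filter {
      # the header row carries the literal string "Date" in the Date field; drop it
      if [Date] == "Date" {
        drop { }
      }
    }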

    Now let's pick up the new index in Kibana.

    If the data arrived correctly, check it in Discover.

    It can also be viewed as a line chart.
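    For reference, the line chart corresponds to a date_histogram aggregation. Assuming the Date field ended up mapped as a date (see the notes above), the monthly average close can be queried directly; a sketch:

    % curl -XGET 'http://localhost:9200/stock/_search?pretty' -H 'Content-Type: application/json' -d'
    {
      "size": 0,
      "aggs": {
        "close_over_time": {
          "date_histogram": { "field": "Date", "calendar_interval": "month" },
          "aggs": { "avg_close": { "avg": { "field": "Close" } } }
        }
      }
    }'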

     

     

    Done ✨

     

     

     

    (Referenced site ✨)

    https://kim-dragon.tistory.com/20

     
