--conf spark.driver.maxResultSize=4g
https://wooono.tistory.com/41 [Spark] spark.driver.maxResultSize error
Error: org.apache.spark.SparkException: Job aborted due to stage failure: Total size of serialized results of XXXX tasks (X.0 GB) is bigger than spark.driver.maxResultSize (X.0 GB)
Cause: data that was distributed as RDDs is pulled back to the driver with collect() and similar actions, and its total size exceeds the limit.
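A minimal sketch of raising the limit when building the session programmatically instead of via spark-submit (the app name and local master are placeholders):

```kotlin
import org.apache.spark.sql.SparkSession

fun main() {
    // Raise the cap on the total size of serialized task results
    // collected back to the driver ("0" would disable the check,
    // at the risk of an out-of-memory driver).
    val spark = SparkSession.builder()
        .appName("max-result-size-demo")
        .master("local[*]")
        .config("spark.driver.maxResultSize", "4g")
        .getOrCreate()

    // Prefer take(n) or writing results out over collect(),
    // so the driver never has to hold the full dataset.
    spark.range(10).toDF("id").show()
    spark.stop()
}
```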
https://stackoverflow.com/questions/27932345/downloading-folders-from-aws-s3-cp-or-sync Downloading folders from aws s3, cp or sync? If I want to download all the contents of a directory on S3 to my local PC, which command should I use, cp or sync?
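In short, both can do it: aws s3 cp s3://bucket/prefix ./local --recursive copies the whole tree every time, while aws s3 sync s3://bucket/prefix ./local is recursive by default and only transfers files that are new or changed.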
https://community.cloudera.com/t5/Support-Questions/How-to-set-yarn-application-name-of-hive-job/td-p/185524 How to set the YARN application name of a Hive job: on HDP 2.4 with Hive on Tez, the job shows up in the YARN ResourceManager page with a generated name like HIVE-2f58f71e-4c29-4092-ac04-6e63c15ee223; how can it be given a meaningful name?
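The usual starting points, depending on the execution engine: SET mapred.job.name=my_job; renames MapReduce jobs, and on Tez SET hive.query.name=my_job; names the query in the Tez/YARN UIs (support varies by Hive version).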
hdfs dfsadmin -report
hdfs fsck / -list-corruptfileblocks
hdfs fsck <path> -delete
https://118k.tistory.com/469 [hadoop][fsck] Commands for checking the health of HDFS. fsck detects various inconsistencies on HDFS (missing blocks, under-replicated blocks). It reports errors but does not fix them (the NameNode repairs recoverable errors automatically), and it ignores open files. Example: hadoop fsck /
fun main() {
    val array: Array<String> = arrayOf("a", "b", "c", "d", "e")
    val list: List<String> = array.toList()
    list.forEach { println(it) }
}
https://codechacha.com/ko/kotlin-convert-list-to-array/ Kotlin: converting an Array to a List. toList() converts an Array to a List; toMutableList() returns a MutableList instead. You can also build one with listOf().
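For the opposite direction, not covered in the snippet above, the stdlib round trip looks like this:

```kotlin
fun main() {
    val list = listOf("a", "b", "c")
    val array: Array<String> = list.toTypedArray() // List -> Array
    val mutable = array.toMutableList()            // Array -> MutableList
    println(array.joinToString() + " / " + mutable)
}
```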
aws s3 ls --summarize --human-readable --recursive s3://bucket-name/
https://serverfault.com/questions/84815/how-can-i-get-the-size-of-an-amazon-s3-bucket How can I get the size of an Amazon S3 bucket? I'd like to graph the size (in bytes, and # of items) of an Amazon S3 bucket and am looking for an efficient way to get the data.
aws emr list-clusters --active | jq -r ".Clusters[].Id" | while read id ; do
    dns=$(aws emr describe-cluster --cluster-id $id | jq -r ".Cluster.MasterPublicDnsName")
    dns=$(echo $dns | sed -r "s/ip-([0-9]+)-([0-9]+)-([0-9]+)-([0-9]+)\.ap-northeast-2\.compute\.internal/\1.\2.\3.\4/g")
    name=$(aws emr describe-cluster --cluster-id $id | jq -r ".Cluster.Name")
    echo $dns $name
done
# sudo vi /etc/hosts..
location.href = document.querySelector('#reload-button')
    .url
    .replace(/ip-(\d+)-(\d+)-(\d+)-(\d+)/, "$1.$2.$3.$4")
    .replace(".ap-northeast-2.compute.internal", "")
https://stackoverflow.com/questions/29989031/getting-the-current-domain-name-in-chrome-when-the-page-fails-to-load Getting the current domain name in Chrome when the page fails to load
cmd="rm .gitignore" echo "$cmd" eval "$cmd" https://unix.stackexchange.com/questions/356534/how-to-run-string-with-values-as-a-command-in-bash How to run string with values as a command in bash? Here is my small bash script snippet. i=5 command='echo $i' $command I want this script to print 5 i.e., I want it to run echo and print 5. But it instead keeps printing $i. So how do I go about unix.sta..
INSERT OVERWRITE TABLE target PARTITION (YEAR, MONTH)
SELECT A, B, C, YEAR, MONTH FROM temp
https://stackoverflow.com/questions/40143249/hive-insert-overwrite-into-a-partitioned-table HIVE Insert overwrite into a partitioned table: after an INSERT OVERWRITE creates partitions a, b, c, d, e, what happens to those partitions when the INSERT OVERWRITE is rerun with different data?
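For context, the standard Hive behavior: with dynamic partitioning, only the partitions that actually receive rows are overwritten, and untouched partitions keep their old data. A dynamic-partition insert like the one above also typically requires SET hive.exec.dynamic.partition.mode=nonstrict; when no static partition value is given.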
val df = spark.read.orc("/DATA/UNIVERSITY/DEPT/STUDENT/part-00000.orc")
df.printSchema()
https://stackoverflow.com/questions/58288941/how-to-get-the-schema-columns-and-their-types-of-orc-files-stored-in-hdfs How to get the schema (columns and their types) of ORC files stored in HDFS? I have ORC files stored in different folders on HDFS, such as /DATA/UNIVERSITY/DEPT/STUDENT/. (The header option only applies to CSV sources, so it is dropped here.)
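Since ORC files carry their own schema, reading a single part file as above is enough; if a Hive CLI is available, hive --orcfiledump <hdfs-path> should print the same type information without starting a Spark session.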
https://docs.snowflake.com/ko/sql-reference/sql/create-function.html CREATE FUNCTION (Snowflake documentation): specifies whether the function can return NULL or must return only NON-NULL values. The default is NULL (i.e., the function may return NULL). Note: the NOT NULL clause is currently not enforced for SQL UDFs.
https://stackoverflow.com/questions/51743367/hive-create-function-if-not-exists Hive: Create function if not exists.
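The workaround usually suggested for Hive, whose CREATE FUNCTION has no IF NOT EXISTS clause, is DROP FUNCTION IF EXISTS my_fn; followed by CREATE FUNCTION my_fn ...; Snowflake covers the same need with CREATE OR REPLACE FUNCTION.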
https://leetcode.com/problems/decode-ways/description/ Decode Ways (LeetCode).
https://www.techiedelight.com/ko/count-decodings-sequence-of-digits/ Count decodings of a given digit sequence: given a positive number, map each digit to the corresponding alphabet letter [(1, 'A'), (2, 'B'), … (26, 'Z')] and count the possible decodings.
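A minimal dynamic-programming sketch of the counting idea (the standard approach, not code from either page):

```kotlin
// dp[i] = number of ways to decode the first i characters.
fun numDecodings(s: String): Int {
    if (s.isEmpty() || s[0] == '0') return 0
    val dp = IntArray(s.length + 1)
    dp[0] = 1 // empty prefix decodes one way
    dp[1] = 1 // first digit already checked non-zero
    for (i in 2..s.length) {
        val one = s[i - 1] - '0'               // last digit alone
        val two = (s[i - 2] - '0') * 10 + one  // last two digits together
        if (one in 1..9) dp[i] += dp[i - 1]    // "1".."9" maps to a letter
        if (two in 10..26) dp[i] += dp[i - 2]  // "10".."26" maps to a letter
    }
    return dp[s.length]
}

fun main() {
    println(numDecodings("226")) // 3: "BZ", "VF", "BBF"
}
```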
https://simon-aubury.medium.com/kafka-with-avro-vs-kafka-with-protobuf-vs-kafka-with-json-schema-667494cbb2af Kafka with AVRO vs. Kafka with Protobuf vs. Kafka with JSON Schema: experiments with Kafka serialisation schemes, playing with AVRO, Protobuf, and JSON Schema in the Confluent streaming platform.
from inspect import signature

def my_func(a, b, c, param_name='apple'):
    pass

value = signature(my_func).parameters['param_name'].default
print(value == 'apple')  # True

https://stackoverflow.com/questions/12627118/get-a-function-arguments-default-value Get a function argument's default value?
HBase keywords: master, region server, zookeeper, HFile, row key, memstore, WAL, rolling, replay, flush
https://it-sunny-333.tistory.com/175 [HBase] the data read/write path (memstore, WAL, HFile). HDFS reads do full scans rather than random access, and files can only be appended to, not updated; HBase adds random access and updates on top of data stored in HDFS.
https://medium.com/@limgyumin/%EC%BD%94%ED%8B%80%EB%A6%B0-%EC%9D%98-apply-with-let-also-run-%EC%9D%80-%EC%96%B8%EC%A0%9C-%EC%82%AC%EC%9A%A9%ED%95%98%EB%8A%94%EA%B0%80-4a517292df29 When to use Kotlin's apply, with, let, also, and run (Korean translation of "Kotlin Scoping Functions apply vs. with, let, also, and run").
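A compact sketch of the distinctions the article draws (receiver this vs. argument it, returning the receiver vs. the lambda result); the User class is only for illustration:

```kotlin
data class User(var name: String = "", var age: Int = 0)

fun main() {
    // apply: configure the receiver (this), returns the receiver.
    val user = User().apply { name = "Kim"; age = 30 }

    // also: side effects on the value (it), returns the receiver.
    user.also { println("created: $it") }

    // let: transform the value (it), returns the lambda result.
    val nameLength = user.name.let { it.length }

    // run / with: compute a result with the receiver (this).
    val summary = user.run { "$name ($age)" }
    val same = with(user) { "$name ($age)" }

    println("$nameLength / $summary / $same")
}
```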
https://support.atlassian.com/bitbucket-cloud/docs/set-up-or-run-parallel-steps/ Set up or run parallel steps (Bitbucket Cloud): parallel steps let you build and test faster by running a set of self-contained steps at the same time.
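In bitbucket-pipelines.yml this is the parallel: keyword wrapping a list of step: entries; each step runs in its own container, so steps must be self-contained and share files only via declared artifacts.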