scp -i keypair-asdf.pem -r hadoop@asdf:~/tez.tar.gz .
scp -i keypair-qwer.pem -r tez.tar.gz hadoop@qwer-emr:~/tez.tar.gz
https://doheejin.github.io/linux/2021/03/03/linux-scp.html
[Linux] Transferring files (local ↔ server) with the scp command — scp is short for Secure Copy, a command for sending files and folders to, or fetching them from, a remote server. It runs on top of the ssh remote-access protocol and uses the same port 22 as ssh, so the same passw... doheejin.github.io
https://stackoverflow.com/questions/51933568/how-to-retrieve-hive-table-partition-location
How to retrieve Hive table Partition Location? — SHOW PARTITIONS in Hive/Spark only lists the partitions, without the location information on hdfs/s3. Since we maintain a different location for each partition in a tab... stackoverflow.com
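The usual answer on that thread: DESCRIBE FORMATTED with a partition spec prints the Location field for that one partition. A minimal sketch, with hypothetical table and partition names:

```sql
-- Prints detailed metadata for the partition, including its Location on hdfs/s3
-- (table name "logs" and partition column "dt" are hypothetical)
DESCRIBE FORMATTED logs PARTITION (dt='2021-01-01');
```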
https://www.projectpro.io/recipes/explain-study-of-spark-query-execution-plans-using-explain
Study of Spark query execution plans using explain() — this recipe walks through inspecting Spark query execution plans with explain(). www.projectpro.io
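For reference, a self-contained Scala sketch; the boolean flag adds the parsed/analyzed/optimized logical plans to the physical plan, and the string modes are Spark 3.0+:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("explain-demo").master("local[*]").getOrCreate()
val df = spark.range(10).filter("id % 2 = 0")

df.explain()            // physical plan only
df.explain(true)        // parsed, analyzed, and optimized logical plans + physical plan
df.explain("formatted") // Spark 3.0+: also "simple", "extended", "codegen", "cost"
```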
aws s3 sync . s3://asdf/a/b/c/ --delete

aws s3 sync s3://my-bucket s3://my-other-bucket \
  --exclude 'customers/*' \
  --exclude 'orders/*' \
  --exclude 'reportTemplate/*'

https://stackoverflow.com/questions/32393026/exclude-multiple-folders-using-aws-s3-sync
Exclude multiple folders using AWS S3 sync — How do I exclude multiple folders while using aws s3 sync? I tried: # aws s3 sync s3://inksedge-app...
--conf spark.driver.maxResultSize=4g
https://wooono.tistory.com/41
[Spark] spark.driver.maxResultSize error — org.apache.spark.SparkException: Job aborted due to stage failure: Total size of serialized results of XXXX tasks (X.0 GB) is bigger than spark.driver.maxResultSize (X.0 GB). Cause: data that was distributed as RDDs was collected back to the dri... wooono.tistory.com
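Besides passing --conf to spark-submit, the same limit can be set when building the session; a minimal Scala sketch (app name hypothetical, the 4g value is just the example above):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("collect-heavy-job")
  // Raises the cap on serialized results shipped back to the driver;
  // collect() on a large dataset fails once results exceed this size.
  .config("spark.driver.maxResultSize", "4g")
  .getOrCreate()
```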
https://stackoverflow.com/questions/27932345/downloading-folders-from-aws-s3-cp-or-sync
Downloading folders from aws s3, cp or sync? — If I want to download all the contents of a directory on S3 to my local PC, which command should I use, cp or sync? Any help would be highly appreciated. For example, if I want to download all... stackoverflow.com
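The short version of the answers there: both can copy a whole prefix, but sync compares against the destination and copies only new or changed files, while cp --recursive copies everything unconditionally. A sketch with hypothetical bucket/prefix names:

```bash
# Copy everything under the prefix, unconditionally
aws s3 cp s3://my-bucket/reports/ ./reports/ --recursive

# Copy only files that are missing or changed in the local copy
aws s3 sync s3://my-bucket/reports/ ./reports/
```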
https://community.cloudera.com/t5/Support-Questions/How-to-set-yarn-application-name-of-hive-job/td-p/185524
How to set the YARN application name of a Hive job — I use HDP 2.4 with Hive on Tez, and I want to set the job name shown in the YARN ResourceManager page. Right now the Hive job name looks like HIVE-2f58f71e-4c29-4092-ac04-6e63c15ee223 and the application type is Tez. How should I set the Hive job name so it s...
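What the thread converges on, as a hedged sketch: on the MapReduce engine you set mapred.job.name, and on Tez you set hive.query.name, which names the query's DAG in the Tez/ATS UI; the YARN application of a shared Tez session may still show the HIVE-<session-id> name.

```sql
-- Hive on MapReduce: names the MR job in the ResourceManager UI
SET mapred.job.name=my_daily_aggregation;

-- Hive on Tez: names the query's DAG (Tez UI / Application Timeline)
SET hive.query.name=my_daily_aggregation;
```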
hdfs dfsadmin -report
hdfs fsck / -list-corruptfileblocks
hdfs fsck / -delete
https://118k.tistory.com/469
[hadoop][fsck] Commands for checking the health of HDFS — the HDFS fsck command checks for various inconsistencies on HDFS (missing blocks, under-replicated blocks). It finds errors but does not fix them (the NameNode automatically repairs what it can recover), and it ignores open files. > hadoop fsck / 118k.tistory.com
fun main(args: Array<String>) {
    val array: Array<String> = arrayOf("a", "b", "c", "d", "e")
    val list: List<String> = array.toList()
    list.forEach { println(it) }
}
https://codechacha.com/ko/kotlin-convert-list-to-array/
Kotlin - Converting an Array to a List — shows how to convert an array to a list in Kotlin. toList() converts an Array to a List; toMutableList() returns a MutableList instead of a List. You can also convert with listOf(). codechacha.com
aws s3 ls --summarize --human-readable --recursive s3://bucket-name/
https://serverfault.com/questions/84815/how-can-i-get-the-size-of-an-amazon-s3-bucket
How can I get the size of an Amazon S3 bucket? — I'd like to graph the size (in bytes, and # of items) of an Amazon S3 bucket and am looking for an efficient way to get the data. The s3cmd tools provide a way to get the total file size using s3c... serverfault.com
aws emr list-clusters --active | jq -r ".Clusters[].Id" | while read id ; do
  dns=$(aws emr describe-cluster --cluster-id $id | jq -r ".Cluster.MasterPublicDnsName")
  dns=$(echo $dns | sed -r "s/ip-([0-9]+)-([0-9]+)-([0-9]+)-([0-9]+)\.ap-northeast-2\.compute\.internal/\1.\2.\3.\4/g")
  name=$(aws emr describe-cluster --cluster-id $id | jq -r ".Cluster.Name")
  echo $dns $name
done
# sudo vi /etc/hosts..
location.href = document.querySelector('#reload-button')
  .url
  .replace(/ip-(\d+)-(\d+)-(\d+)-(\d+)/, "$1.$2.$3.$4")
  .replace(".ap-northeast-2.compute.internal", "")
https://stackoverflow.com/questions/29989031/getting-the-current-domain-name-in-chrome-when-the-page-fails-to-load
Getting the current domain name in Chrome when the page fails to load — If you try to load with Chrome: http://sdqdsqdqsd...
cmd="rm .gitignore" echo "$cmd" eval "$cmd" https://unix.stackexchange.com/questions/356534/how-to-run-string-with-values-as-a-command-in-bash How to run string with values as a command in bash? Here is my small bash script snippet. i=5 command='echo $i' $command I want this script to print 5 i.e., I want it to run echo and print 5. But it instead keeps printing $i. So how do I go about unix.sta..
INSERT OVERWRITE TABLE target PARTITION (YEAR, MONTH)
SELECT A, B, C, YEAR, MONTH FROM temp
https://stackoverflow.com/questions/40143249/hive-insert-overwrite-into-a-partitioned-table
HIVE Insert overwrite into a partitioned Table — I ran an insert overwrite on a partitioned table. After the command, say for example the partitions a,b,c,d,e are created. Now when I rerun the insert overwrite, but this ti... stackoverflow.com
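One thing that thread calls out: a dynamic-partition INSERT OVERWRITE like the one above only replaces the partitions the SELECT actually produces; untouched partitions keep their old data. It also typically needs dynamic partitioning enabled first, a sketch:

```sql
-- Required before a fully dynamic INSERT OVERWRITE ... PARTITION (YEAR, MONTH)
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;
-- The partition columns (YEAR, MONTH) must come last in the SELECT list
```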
val df = spark.read.option("header", "true").orc("/DATA/UNIVERSITY/DEPT/STUDENT/part-00000.orc")
df.printSchema()
https://stackoverflow.com/questions/58288941/how-to-get-the-schema-columns-and-their-types-of-orc-files-stored-in-hdfs
How to get the schema (columns and their types) of ORC files stored in HDFS? — I have ORC files stored in different folders on HDFS as follows: /DATA/UNIVERSITY/DEPT/ST... stackoverflow.com
https://docs.snowflake.com/ko/sql-reference/sql/create-function.html
CREATE FUNCTION — Snowflake Documentation — specifies whether the function can return NULL values or must return only NON-NULL values. The default is NULL (i.e., the function may return NULL). Note: currently, the NOT NULL clause is not enforced for SQL UDFs. docs.snowflake.com
https://stackoverflow.com/questions/51743367/hive-create-function-if-not-exists
Hive: Create function if not exists — At the start of my h...
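Hive's CREATE FUNCTION has no IF NOT EXISTS clause; the workaround from the SO thread is to drop first, since DROP FUNCTION does accept IF EXISTS. A sketch with hypothetical function, class, and jar names:

```sql
DROP FUNCTION IF EXISTS my_udf;
CREATE FUNCTION my_udf AS 'com.example.udf.MyUdf'
  USING JAR 'hdfs:///libs/my-udfs.jar';
```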
https://leetcode.com/problems/decode-ways/description/
Decode Ways - LeetCode leetcode.com
https://www.techiedelight.com/ko/count-decodings-sequence-of-digits/
Count decodings of a given digit sequence — given a positive number, map its digits to the corresponding letters in the mapping table [(1, 'A'), (2, 'B'), … (26, 'Z')] and count the possible...
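The standard O(n) DP for this problem keeps two running counts (ways ending at the previous two positions): a single digit 1-9 extends by one character, a pair 10-26 extends by two. A sketch in Kotlin:

```kotlin
fun numDecodings(s: String): Int {
    if (s.isEmpty() || s[0] == '0') return 0
    var prev = 1  // ways to decode s up to index i-2
    var curr = 1  // ways to decode s up to index i-1
    for (i in 1 until s.length) {
        var next = 0
        if (s[i] != '0') next += curr               // take s[i] alone (digits 1-9)
        val two = s.substring(i - 1, i + 1).toInt()
        if (two in 10..26) next += prev             // take s[i-1..i] as one letter
        prev = curr
        curr = next
    }
    return curr  // e.g. "226" -> 3 ("BZ", "VF", "BBF")
}
```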
https://simon-aubury.medium.com/kafka-with-avro-vs-kafka-with-protobuf-vs-kafka-with-json-schema-667494cbb2af
Kafka with AVRO vs. Kafka with Protobuf vs. Kafka with JSON Schema — experiments with Kafka serialisation schemes: playing with AVRO, Protobuf, and JSON Schema in Confluent Streaming Platform. The code for... simon-aubury.medium.com
from inspect import signature

def my_func(a, b, c, param_name='apple'):
    pass

value = signature(my_func).parameters['param_name'].default
print(value == 'apple')  # True

https://stackoverflow.com/questions/12627118/get-a-function-arguments-default-value
Get a function argument's default value? — For this function def eat_dog(name, should_di... stackoverflow.com
master, region server, zookeeper, HFile, row key, memstore, WAL, rolling, replay, flush
https://it-sunny-333.tistory.com/175
[HBase] The data read/write path (memstore, WAL, HFile) — when reading data, HDFS does a full scan rather than random access, and it cannot update files, only append to them. HBase provides random access to data on HDFS and makes updates possible. it-sunny-333.tistory.com
https://medium.com/@limgyumin/%EC%BD%94%ED%8B%80%EB%A6%B0-%EC%9D%98-apply-with-let-also-run-%EC%9D%80-%EC%96%B8%EC%A0%9C-%EC%82%AC%EC%9A%A9%ED%95%98%EB%8A%94%EA%B0%80-4a517292df29
When to use Kotlin's apply, with, let, also, and run? — a translation of "Kotlin Scoping Functions apply vs. with, let, also, and run". medium.com
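The one-line summary of that article: apply and also return the receiver (good for configuration and side effects), while let, run, and with return the lambda result. A quick Kotlin sketch:

```kotlin
data class User(var name: String = "", var age: Int = 0)

fun main() {
    val u = User().apply { name = "Kim"; age = 30 }  // returns the configured object
    val logged = u.also { println("created: $it") }  // side effect, returns u itself
    val label = u.name.let { "user:$it" }            // transforms a value, returns result
    val summary = u.run { "$name ($age)" }           // result computed from the receiver
    val greeting = with(u) { "Hello, $name" }        // like run, receiver as an argument
    println(listOf(label, summary, greeting, logged.age))
}
```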
https://support.atlassian.com/bitbucket-cloud/docs/set-up-or-run-parallel-steps/
Set up or run parallel steps | Bitbucket Cloud | Atlassian Support — in Bitbucket Cloud, parallel steps enable you to build and test faster by running a set of self-contained steps at the same time. support.atlassian.com
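A minimal bitbucket-pipelines.yml sketch of a parallel block, assuming two self-contained steps (step names and scripts hypothetical):

```yaml
pipelines:
  default:
    - parallel:
        - step:
            name: Build
            script:
              - ./gradlew build
        - step:
            name: Lint
            script:
              - ./gradlew lint
```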
https://dydwnsekd.tistory.com/62
Using Jinja templates in Airflow — Airflow ships with Jinja2 templating built in; for the details of Jinja2 templates, see the Jinja documentation: https://jinja.palletsprojects.com/en/3.0.x/ dydwnsekd.tistory.com

yesterday = date_utils.date_to_string(kwargs['logical_date'].in_timezone("Asia/Seoul"))
today = date_utils.add_date(yesterday, 1)
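The date_utils helpers above are this note's own code; the built-in alternative is to let Airflow's Jinja engine render dates inside templated fields. A minimal sketch using standard macros:

```python
from airflow.operators.bash import BashOperator

# bash_command is a templated field: {{ ds }} and macros.ds_add(...)
# are rendered by Airflow's Jinja engine at task runtime.
run_report = BashOperator(
    task_id="run_report",
    bash_command="echo today={{ ds }} yesterday={{ macros.ds_add(ds, -1) }}",
)
```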
https://stackoverflow.com/questions/49234471/when-to-execute-refresh-table-my-table-in-spark/72970987#72970987
When to execute REFRESH TABLE my_table in spark? — Consider the code: import org.apache.spark.sql.hive.orc._; import org.apache.spark.sql._; val path = ...; val dataFrame: DataFrame = ...; val hiveContext = new org.apache.spark.sql.hive.HiveContext(... stackoverflow.com
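For reference, the statement itself, run inside an existing SparkSession; it invalidates Spark's cached metadata and file listing for the table, which is needed after another process rewrites the files under the table's location (table name hypothetical):

```scala
// Drop Spark's cached file listing + metadata for the table, so the next
// query re-reads the current files instead of failing on deleted ones
spark.sql("REFRESH TABLE my_table")
```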
SELECT column FROM table AS T1
INNER JOIN Params AS P1
  ON T1.column LIKE '%' + P1.param + '%';
https://stackoverflow.com/questions/4612282/dynamic-like-statement-in-sql
Dynamic Like Statement in SQL — I've been racking my brain on how to do this for a while, and I know that some genius on this site will have the answer. Basically I'm trying to do this: SELECT column FROM table WHERE [tabl... stackoverflow.com
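Note the '+' concatenation is SQL Server syntax; in Hive/Spark SQL the same dynamic LIKE is written with concat(). A sketch:

```sql
SELECT T1.column
FROM table AS T1
INNER JOIN Params AS P1
  ON T1.column LIKE concat('%', P1.param, '%');
```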
https://stackoverflow.com/questions/42723604/how-to-set-spark-job-staging-location
How to set Spark job staging location — My spark job is failing because the user doesn't have access to the directory where spark is trying to write staging or temp data. 2017-03-10 10:25:47,0928 ERROR JniCommon fs/client/fileclient... stackoverflow.com
https://doc.hcs.huawei.com/en-us/usermanual/mrs/mrs_03_0298.html..
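On YARN, the staging directory can be pointed at a path the user can write via spark.yarn.stagingDir (it defaults to the user's home directory in the filesystem). A sketch; the path and job file are hypothetical:

```bash
spark-submit \
  --master yarn \
  --conf spark.yarn.stagingDir=hdfs:///tmp/spark-staging \
  my_job.py
```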
f"{{{{jinja}}}}_{python}" https://stackoverflow.com/questions/63788781/use-python-f-strings-and-jinja-at-the-same-time Use Python f-strings and Jinja at the same time I am trying to write a concise SQL query string in Python, to make use of both f-strings and Jinja at the same time. Background info: I am writing a query used in Airflow. This did not work: query_... stackoverflow.com