val df = spark.read.orc("/DATA/UNIVERSITY/DEPT/STUDENT/part-00000.orc")
df.printSchema()

(ORC files are self-describing, so the CSV-style header option is unnecessary.)
https://stackoverflow.com/questions/58288941/how-to-get-the-schema-columns-and-their-types-of-orc-files-stored-in-hdfs
How to get the schema (columns and their types) of ORC files stored in HDFS?
https://docs.snowflake.com/ko/sql-reference/sql/create-function.html
CREATE FUNCTION — Snowflake Documentation: specifies whether the function can return NULL values or must return only NON-NULL values. The default is NULL (i.e., the function may return NULL). Note: the NOT NULL clause is currently not enforced for SQL UDFs.
https://stackoverflow.com/questions/51743367/hive-create-function-if-not-exists
Hive: Create function if not exists
https://leetcode.com/problems/decode-ways/description/
Decode Ways — LeetCode
https://www.techiedelight.com/ko/count-decodings-sequence-of-digits/
Count decodings of a given digit sequence: given a positive number, map its digits to the corresponding letters of the table [(1, 'A'), (2, 'B'), … (26, 'Z')] and count the possible decodings.
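The counting problem in both links is the classic "Decode Ways" dynamic program; a minimal Python sketch of the standard O(n)-time, O(1)-space recurrence:

```python
def num_decodings(s: str) -> int:
    """Count decodings of a digit string under the mapping 1->'A' ... 26->'Z'."""
    if not s or s[0] == "0":
        return 0
    # prev2, prev1 play the roles of dp[i-2], dp[i-1]:
    # dp[i] = number of ways to decode the first i characters.
    prev2, prev1 = 1, 1
    for i in range(1, len(s)):
        cur = 0
        if s[i] != "0":                    # decode s[i] as a single digit
            cur += prev1
        if 10 <= int(s[i - 1:i + 1]) <= 26:  # decode s[i-1:i+1] as a pair
            cur += prev2
        prev2, prev1 = prev1, cur
    return prev1

print(num_decodings("226"))  # 3  ("BZ", "VF", "BBF")
```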
https://simon-aubury.medium.com/kafka-with-avro-vs-kafka-with-protobuf-vs-kafka-with-json-schema-667494cbb2af
Kafka with AVRO vs. Kafka with Protobuf vs. Kafka with JSON Schema: experiments with Kafka serialisation schemes in the Confluent Streaming Platform.
from inspect import signature

def my_func(a, b, c, param_name='apple'):
    pass

value = signature(my_func).parameters['param_name'].default
print(value == 'apple')  # True

https://stackoverflow.com/questions/12627118/get-a-function-arguments-default-value
Get a function argument's default value?
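The same `inspect.signature` approach can collect every default at once; a small sketch using the `eat_dog` function from the linked question (`Parameter.empty` marks parameters that have no default):

```python
from inspect import signature, Parameter

def eat_dog(name, should_digest=True):
    pass

# Map each parameter name to its default, skipping parameters without one.
defaults = {
    name: p.default
    for name, p in signature(eat_dog).parameters.items()
    if p.default is not Parameter.empty
}
print(defaults)  # {'should_digest': True}
```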
master, region server, zookeeper, HFile, row key, memstore, WAL, rolling, replay, flush
https://it-sunny-333.tistory.com/175
[HBase] Data read/write path (memstore, WAL, HFile): HDFS does full scans rather than random access when reading, and data cannot be updated, only appended. HBase provides random access to data stored on HDFS and makes it updatable.
https://medium.com/@limgyumin/%EC%BD%94%ED%8B%80%EB%A6%B0-%EC%9D%98-apply-with-let-also-run-%EC%9D%80-%EC%96%B8%EC%A0%9C-%EC%82%AC%EC%9A%A9%ED%95%98%EB%8A%94%EA%B0%80-4a517292df29
When to use Kotlin's apply, with, let, also, and run? (Translation of "Kotlin Scoping Functions apply vs. with, let, also, and run")
https://support.atlassian.com/bitbucket-cloud/docs/set-up-or-run-parallel-steps/
Set up or run parallel steps | Bitbucket Cloud | Atlassian Support: in Bitbucket Cloud, parallel steps enable you to build and test faster by running a set of self-contained steps at the same time.
https://dydwnsekd.tistory.com/62
Using Jinja templates in Airflow: Airflow ships with Jinja2 templating built in; for details on Jinja2 itself, see the Jinja documentation (https://jinja.palletsprojects.com/en/3.0.x/).

yesterday = date_utils.date_to_string(kwargs['logical_date'].in_timezone("Asia/Seoul"))
today = date_utils.add_date(yesterday, 1)
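A stdlib-only sketch of what the two helper calls above compute (`date_utils` is the author's own helper; `logical_date` is shown with a sample value, and the `YYYY-MM-DD` output format is an assumption):

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

# Sample stand-in for Airflow's logical_date (a UTC-aware datetime).
logical_date = datetime(2023, 5, 31, 18, 0, tzinfo=ZoneInfo("UTC"))

# Render the logical date in Asia/Seoul, then add one day.
local = logical_date.astimezone(ZoneInfo("Asia/Seoul"))
yesterday = local.strftime("%Y-%m-%d")
today = (local + timedelta(days=1)).strftime("%Y-%m-%d")
print(yesterday, today)  # 2023-06-01 2023-06-02
```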
https://stackoverflow.com/questions/49234471/when-to-execute-refresh-table-my-table-in-spark/72970987#72970987
When to execute REFRESH TABLE my_table in Spark? Consider the code:

import org.apache.spark.sql.hive.orc._
import org.apache.spark.sql._
val path = ...
val dataFrame: DataFrame = ...
val hiveContext = new org.apache.spark.sql.hive.HiveContext(
SELECT column FROM table AS T1
INNER JOIN Params AS P1
    ON T1.column LIKE '%' + P1.param + '%';

https://stackoverflow.com/questions/4612282/dynamic-like-statement-in-sql
Dynamic LIKE Statement in SQL
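The same dynamic-LIKE join can be tried in SQLite via Python's built-in `sqlite3` (note that `||` is SQLite's string concatenation; the `+` above is SQL Server syntax; table and column names here are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE t (col TEXT);
    INSERT INTO t VALUES ('spark job'), ('hive query'), ('kafka topic');
    CREATE TABLE params (param TEXT);
    INSERT INTO params VALUES ('job'), ('topic');
""")

# Join rows whose col contains any param as a substring.
rows = conn.execute(
    "SELECT col FROM t AS T1 "
    "INNER JOIN params AS P1 ON T1.col LIKE '%' || P1.param || '%'"
).fetchall()
print(rows)  # [('spark job',), ('kafka topic',)]
```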
https://stackoverflow.com/questions/42723604/how-to-set-spark-job-staging-location
How to set Spark job staging location: the Spark job fails because the user doesn't have access to the directory where Spark tries to write its staging or temp dataset.
https://doc.hcs.huawei.com/en-us/usermanual/mrs/mrs_03_0298.html
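A common fix discussed for this class of failure is to point the staging directory at a path the submitting user can write to; on YARN the relevant setting is `spark.yarn.stagingDir` (the HDFS path below is a placeholder):

```shell
spark-submit \
  --conf spark.yarn.stagingDir=hdfs:///user/myuser/.sparkStaging \
  ...
```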
f"{{{{jinja}}}}_{python}"

https://stackoverflow.com/questions/63788781/use-python-f-strings-and-jinja-at-the-same-time
Use Python f-strings and Jinja at the same time: writing a concise SQL query string in Python (for Airflow) that uses both f-string interpolation and Jinja templating.
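Why the quadruple braces work: doubling a brace escapes it in an f-string, so `{{{{` renders as the literal `{{` that Jinja expects, while `{python}` is interpolated immediately. A tiny sketch (the variable value is illustrative):

```python
python = "ds"
# {{{{ -> literal {{, }}}} -> literal }}, {python} -> "ds"
rendered = f"{{{{jinja}}}}_{python}"
print(rendered)  # {{jinja}}_ds
```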
spark.sql("REFRESH TABLE schema.table")

https://stackoverflow.com/questions/73294636/spark-refresh-delta-table-in-s3
Spark: refresh Delta Table in S3 — how can I run the REFRESH TABLE command on a Delta table in S3? When I do:

deltatable = DeltaTable.forPath(spark, "s3a://test-bucket/delta_table/")
spark.catalog.refreshTable(deltatable)

https://stackoverflow.com/questions/4923..
deactivate

https://stackoverflow.com/questions/990754/how-to-leave-exit-deactivate-a-python-virtualenv
How to leave/exit/deactivate a Python virtualenv (with virtualenv and virtualenvwrapper, `workon env1` switches environments; `deactivate` exits the active one).
SELECT DATE_FORMAT(event_time, "yyyy-MM-dd'T'HH:mm:ss'Z'") event_time

https://stackoverflow.com/questions/66679640/convert-timestamp-format-to-iso-time-format-in-pyspark
Convert timestamp format to ISO time format in PySpark: a DataFrame column mixes two timestamp formats, "11-04-2019,00:32:13" and "2019-12-05T07:57:16.000Z"; convert them all to ISO format.
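For comparison, the same target format produced with plain Python's `strftime` (sample timestamp taken from the question; note `strftime` treats the unescaped `T` and `Z` as literal characters, unlike Spark's pattern, which needs them quoted):

```python
from datetime import datetime, timezone

event_time = datetime(2019, 12, 5, 7, 57, 16, tzinfo=timezone.utc)
iso = event_time.strftime("%Y-%m-%dT%H:%M:%SZ")
print(iso)  # 2019-12-05T07:57:16Z
```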
https://www.jenkins.io/doc/book/pipeline/syntax/#parameters
Pipeline Syntax — Jenkins: Scripted Pipeline, like Declarative Pipeline, is built on top of the underlying Pipeline sub-system; unlike Declarative, Scripted Pipeline is effectively a general-purpose DSL built with Groovy.
https://waspro.tistory.com/554
[Jenkins] Pipeline
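A minimal declarative-pipeline sketch of the `parameters` directive from the linked docs (parameter names and values are illustrative):

```groovy
pipeline {
    agent any
    parameters {
        // Exposed as params.TARGET_ENV / params.DRY_RUN inside the pipeline.
        string(name: 'TARGET_ENV', defaultValue: 'dev', description: 'Deploy target')
        booleanParam(name: 'DRY_RUN', defaultValue: true, description: 'Skip side effects')
    }
    stages {
        stage('Build') {
            steps {
                echo "env=${params.TARGET_ENV}, dryRun=${params.DRY_RUN}"
            }
        }
    }
}
```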