2014-09-05

I want to write a simple-ish solution to back up all of the string files we currently have. This includes any locale files. The idea is to use curl to download all of the files from Smartling.

At the moment I have a number of other issues, but the main goal is to download all of the files.

The current curl call for a single file:

curl -d "apiKey=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx 
&fileUri=%2Fpath%2Fto%2Ffolder%2Fstrings.xml 
&projectId=1234567890" "https://api.smartling.com/v1/file/get/" 

This gets me the single strings.xml file. I can additionally specify a locale with &locale=DE to get the same file with its translations.
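
For example, adding a locale to the same call (the exact locale code, e.g. de-DE, is just a placeholder here and depends on how the project is configured in Smartling):

curl -d "apiKey=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx&fileUri=%2Fpath%2Fto%2Ffolder%2Fstrings.xml&projectId=1234567890&locale=de-DE" "https://api.smartling.com/v1/file/get/"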

Can I use a wildcard to download all *.xml files?

Answer


Smartling has a shell script that interacts with the Smartling API and can perform this type of operation (and much more). Say you want to download all of the translated files with their published translations; the command for that would look something like this:

./download-smartling-files.sh -t published -a xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx -p 1234567890 -l "nl-NL ru-RU" 

Don't forget to update the -l locales parameter with the locales configured for your Smartling project.
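
If you're not sure which locales are configured, one way to check (it's the same call the script below uses) is the project/locale/list API, for example:

curl -sS -d "apiKey=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx&projectId=1234567890" "https://api.smartling.com/v1/project/locale/list"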

If you want to download all files, including any and all pending translations as well as published translations, then the command would look like:

./download-smartling-files.sh -t pending -a xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx -p 1234567890 -l "nl-NL ru-RU" 

Both of these commands call the Smartling API to get a list of all files currently uploaded to Smartling, then iterate over them, downloading the translated files into a translated/[locale] directory.
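
Stripped down to the essentials, that list-then-iterate approach looks roughly like the sketch below (placeholder key/project, a single hard-coded locale, and it assumes file URIs without spaces; the full script below adds pagination, locale discovery, status checks and safer file naming):

SL_APIKEY=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx
SL_PROJECT=1234567890
mkdir -p translated/nl-NL
# list every file URI in the project, then fetch the published nl-NL translation of each
for uri in $(curl -sS -d "apiKey=$SL_APIKEY&projectId=$SL_PROJECT" "https://api.smartling.com/v1/file/list" \
             | grep -Eo 'fileUri":"[^"]*' | sed -e 's/fileUri":"//g'); do
    curl -sS -d "apiKey=$SL_APIKEY&projectId=$SL_PROJECT&fileUri=$uri&locale=nl-NL&retrievalType=published" \
         "https://api.smartling.com/v1/file/get" > "translated/nl-NL/${uri##*/}"
done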

There are plenty of other operations you can perform with the script, including downloading a TMX file of all your translations. To see how to do that, as well as the other operations, just run:

./download-smartling-files.sh 
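
For instance, based on the options listed in the script's usage output, a TMX download of all translations for a couple of locales might look something like:

./download-smartling-files.sh -t TMX -x full -a xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx -p 1234567890 -l "nl-NL ru-RU"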

One other option would be to use the Smartling Maven Plugin. That is typically used when localization is part of your build process and you want to upload and/or download files during build and deploy events.

Here is the download-smartling-files.sh script:

#!/bin/bash 
# author: Eric Negron ([email protected]) 
# last updated February 20, 2014 

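# nullglob: globs that match nothing expand to an empty list instead of the literal pattern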
shopt -s nullglob 


function delete_files() { 

    # print the files first to make damn sure they know what they are deleting 
    COUNT=1 
    for CURRENT_FILE in ${uris[@]} 
    do 
     echo $'\n'$COUNT" "$CURRENT_FILE 
     ((COUNT++)) 
     done 

    echo "Are you SURE you want to delete all "${#uris[@]}" files?" 

    # confirm by getting the user to enter 'YES' to input, and if YES then delete these files 
    COUNT=1 
    read delete_confirmation 
    if [ "$delete_confirmation" != "YES" ]; then 
     echo "sorry, must answer YES to delete. exiting" 
    exit 

    else 
     for CURRENT_FILE in ${uris[@]} 
     do 
      echo $'\n'$COUNT" "$CURRENT_FILE 
      curl -sS -d "apiKey=$SL_APIKEY&projectId=$SL_PROJECT&fileUri=$CURRENT_FILE" "https://$SERVER_URL/v1/file/delete" 
      ((COUNT++)) 
     done 
    fi 
} 

function usage { 
cat << EOF 
    This script uses the Smartling API 
    OPTIONS: 
    -h  Show this message 
    -t (REQUIRED) download type. Acceptable types: original | published | p100 | pseudo | pending | TMX | DELETE 
    -a (REQUIRED) API key 
    -p (REQUIRED) project ID 
    -u (OPTIONAL) URI mask 
    -l (OPTIONAL) locales list - a quoted array of Smartling locales, e.g. "es-ES ja-JP" 
    -x (OPTIONAL) TMX download type: full (default) | published - only valid with -t TMX 
    -d (OPTIONAL) when downloading translations or TMX set this option to false to append filename with locale instead of creating locale folders. Not valid with -t original option. 
    -E (OPTIONAL) DELETE files. -E must be true, type must be DELETE, and the -u MASK option must be set 
EOF 
} 

# hardcode to regular API, not sandbox 
SERVER_URL=api.smartling.com 

#opt t 
DL_TYPE= 

#opt a 
SL_APIKEY= 

#opt p 
SL_PROJECT= 

#opt u 
URI_MASK= 

#opt l 
LOCALES= 

#opt x 
TMX_FLAG="full" 

#opt d 
USE_LOCALE_DIR="true" 

#opt E 
DELETE_CONFIRM="false" 

while getopts "ht:a:p:u:l:x:d:E:" OPTION 
do 
    case $OPTION in 
     h) 
      usage 
      exit 1 
      ;; 
     t) 
      DL_TYPE=$OPTARG 
      ;; 
     a) 
      SL_APIKEY=$OPTARG 
      ;; 
     p) 
      SL_PROJECT=$OPTARG 
      ;; 
     u) 
      URI_MASK=$OPTARG 
      ;; 
     l) 
      LOCALES=($OPTARG) 
      ;; 
     x) 
      TMX_FLAG=$OPTARG 
      ;; 
     d) 
      USE_LOCALE_DIR=$OPTARG 
      ;; 
     E) 
      DELETE_CONFIRM=$OPTARG 
      ;; 

    esac 
done 

# make sure the required parameters are set to something 
# TODO check if getopts handles this natively in some way 
if [ "$DL_TYPE" == "" ] || [ "$SL_APIKEY" == "" ] || [ "$SL_PROJECT" == "" ]; then 
    usage 
    exit 
fi 

# make sure download type is valid 
if [ "$DL_TYPE" != "original" ]\ 
&& [ "$DL_TYPE" != "published" ]\ 
&& [ "$DL_TYPE" != "p100" ]\ 
&& [ "$DL_TYPE" != "pseudo" ]\ 
&& [ "$DL_TYPE" != "pending" ]\ 
&& [ "$DL_TYPE" != "TMX" ]\ 
&& [ "$DL_TYPE" != "DELETE" ]; then 
echo "INVALID TYPE. Acceptable types: original | published | p100 | pseudo | pending | TMX. Exiting." 
exit 
fi 

# if user has specified mask confirm it - note mask means nothing to TMX 
if [ "$URI_MASK" != "" ]; then 
    if [ $DL_TYPE == "TMX" ]; then 
     echo "Mask option has no effect when downloading TMX. Ignoring." 
     URI_MASK="" 
    else 
     echo "mask was set!"$'\n' 
    fi 
fi 

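# split command-substitution output on newlines only, so file URIs that contain spaces are not split apart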
IFS=$'\n' 

#make sure there are no API configuration errors by doing a file/list call just to check for error conditions 
errors=($(curl -sS -d "apiKey=$SL_APIKEY&projectId=$SL_PROJECT&uriMask=$URI_MASK" "https://$SERVER_URL/v1/file/list" | grep -Eo "ERROR")) 
error_count=${#errors[@]} 
if [ $error_count -gt 0 ]; then 
    echo "error with either the API key or Project ID - verify your values." 
    exit 
fi 

# figure out how many total files there are - because if more than the standard 500 are returned in the list, we need to paginate to build the list of uris 
# need to move this check since if downloading TMX it's possible there are no files. 
total_files=($(curl -sS -d "apiKey=$SL_APIKEY&projectId=$SL_PROJECT&uriMask=$URI_MASK" "https://$SERVER_URL/v1/file/list" | grep -Eo "fileCount\":[0-9]+" | sed -e 's/fileCount\"://g')) 


# The /file/list API has a limit of 500 items, if the total needed is more, need to make multiple calls to build the full URI list 

listLimit=500 

# figure out how many passes in batches of 500 needed 

let "passes=$total_files/$listLimit" 

# if more than 1 batch needed, then make as many passes as needed to build the list 

pass=0 

while [ $pass -le $passes ]; do 

let "passOffset=$pass*$listLimit" 

    uris+=($(curl -sS -d "apiKey=$SL_APIKEY&projectId=$SL_PROJECT&uriMask=$URI_MASK&offset=$passOffset" "https://$SERVER_URL/v1/file/list" | grep -Eo "fileUri\":\"[^\"]*" | sed -e 's/fileUri":"//g')) 
    ((pass++)) 

done 

# exit if requesting to download files, but nothing matched at all, otherwise show the count of files that matched and the names 
# TODO: refactor since TMX handling is different with getopts 

file_count=${#uris[@]} 
if [ $file_count = "0" ] && [ $DL_TYPE != "TMX" ]; then echo "nothing to operate on, exiting!"; exit; fi 

# if deleting check all the parameters set and if so pass of URI mask list to delete function 
if [ "$DL_TYPE" == "DELETE" ]; then 
    if [ "$DELETE_CONFIRM" == "true" ] && [ "$URI_MASK" != "" ]; then 
     delete_files $uris 
     exit 
    else 
     echo "Must set -u to a uri mask value and must set -E true to confirm deletion. Exiting" 
     exit 
    fi 
fi 

# if not downloading TMX - then list the files we found 
echo "total files to download "$total_files$'\n' 
if [ $DL_TYPE != "TMX" ]; then echo "URIs: "${uris[@]}$'\n'; fi 


#BEGIN ORIGINALS 
if [ "$DL_TYPE" == "original" ] ; then 
    echo "downloading originals" 
    echo "files to download: "$file_count$'\n' 

# create the originals folder if it doesn't exist 
    if [ ! -d "originals" ]; then 
     echo "making originals folder"$'\n' 
     mkdir originals 
    fi 

    COUNT=0 
    # go through the list of files to download (in each language) 
    for CURRENT_FILE in ${uris[@]} 
    do 
     echo $CURRENT_FILE 
     ((COUNT++)) 
     echo $COUNT 

      # the URI is a full URI including the default smartling prefix '/files/' or any other prefix specified - strip that and just use the last part after the last '/'
      # e.g. if the URI is /files/filename.ext then BF_NAME will be filename.ext - this is what we use for the local file name (in the folder) 
     BF_NAME=${CURRENT_FILE##*/} 
     # This is curl call that downloads the originals. Since no LOCALE is set that is the behavior of /file/get 

      # because we are flattening the URIs to the basename it's possible filenames could be duplicated - so check if the file already exists and if it does prepend the count to make the name unique 
     # TODO USE COUNT 
     if [ -e originals/$BF_NAME ]; then 
      BF_NAME=$COUNT.$BF_NAME 
      echo "Updated BF_NAME" 
     fi 

     curl -sS -d "apiKey=$SL_APIKEY&projectId=$SL_PROJECT&fileUri=$CURRENT_FILE" "https://$SERVER_URL/v1/file/get" > originals/$BF_NAME 
     echo "downloaded URI: "$CURRENT_FILE" to: originals/"$BF_NAME$'\n' 

    done 
exit 
fi 
#end of originals 


# if not getting originals - then getting translations OR TMX - so get locales and then use that to download all 
# get the list of locales for this project. Similar to above grep but the locale key value then sed for just the value 

# if user has not set the locales array in the call, then get the full list via API 
# TODO - validate the locales are good - otherwise user will just get empty files 
if [ "$LOCALES" = "" ]; then 
    LOCALES=($(curl -sS -d "apiKey=$SL_APIKEY&projectId=$SL_PROJECT" "https://$SERVER_URL/v1/project/locale/list" | grep -Eo "locale\":\"[^\"]*" | sed -e 's/locale":"//g')) 
fi 

echo "locales: "${#LOCALES[@]}$'\n' 
# count of the locales 
echo ${LOCALES[@]}$'\n' 

#script puts the translated' versions in 'translated folder' with sub-folders for locales 
if [ ! -d "translated" ]; then 
    echo "making translated folder"$'\n' 
    mkdir translated 
fi 

if [ "$USE_LOCALE_DIR" == "true" ]; then 
    # make the subfolders in the translated folder if they don't exist 
    for CURRENT_LOCALE in ${LOCALES[@]} 
     do 
     if [ ! -d translated/$CURRENT_LOCALE ]; then 
      echo "making "$CURRENT_LOCALE" folder" $'\n' 
      mkdir translated/$CURRENT_LOCALE 
     fi 
    done 
fi 

#TODO need to refactor TMX/MASK/PUBLISHED/FULL 
# Download TMX 
# Downloading TMX uses a different API than downloading translated files 
if [ "$DL_TYPE" == "TMX" ] ; then 
    echo "downloading TMX "$TMX_FLAG 
    for CURRENT_LOCALE in ${LOCALES[@]} 
    do 
     # if using folders - set the folder locale 
     # else using file-names so set the filename locale 
     if [ "$USE_LOCALE_DIR" == "true" ]; then 
      LOCALE_DIR=$CURRENT_LOCALE"/" 
      FILE_LOCALE="TMX-" 
     else 
      FILE_LOCALE=$CURRENT_LOCALE"-" 
      LOCALE_DIR="" 
     fi 

     curl -sS -d "apiKey=$SL_APIKEY&projectId=$SL_PROJECT&locale=$CURRENT_LOCALE&format=TMX&dataSet=$TMX_FLAG" "https://$SERVER_URL/v1/translations/download" > translated/$LOCALE_DIR$FILE_LOCALE"TMX.xml" 
     echo "downloaded " $CURRENT_LOCALE "TMX" $'\n' 
    done 

    exit 
fi 


# downloading translations of files 
# by this point the file list has already been filtered and the locale list has been created 
# go through the list of files to download (in each language) 

COUNT=0 
P100=0 

# since I use DL_TYPE as actual parameter to API call need to set this back to actual "published" before entering loop to download 
if [ $DL_TYPE == "p100" ]; then 
    DL_TYPE="published" 
    P100=1 
fi 


for CURRENT_FILE in ${uris[@]} 
do 
    echo $CURRENT_FILE 
    ((COUNT++)) 
    echo $COUNT 
    # loop through all the locales for each file 

    for CURRENT_LOCALE in ${LOCALES[@]} 
    do 

    # if using folders - set the folder locale 
    # else using file-names so set the filename locale 
    if [ $USE_LOCALE_DIR == "true" ]; then 
     LOCALE_DIR=$CURRENT_LOCALE"/" 
     FILE_LOCALE="" 
    else 
     FILE_LOCALE=$CURRENT_LOCALE"-" 
     LOCALE_DIR="" 
    fi 

      # the URI is a full URI including the default smartling prefix '/files/' or any other prefix specified - strip that and just use the last part after the last '/'
      # e.g. if the URI is /files/filename.ext then BF_NAME will be filename.ext - this is what we use for the local file name (in the locale folder) 
     BF_NAME=${CURRENT_FILE##*/} 

     EXT=${BF_NAME##*.} 
     # rename .pot files to .po - we really should be checking the header from the /file/get API call but this is just simpler 
     if [ "$EXT" == "pot" ]; then 
      filename=${BF_NAME%.*} 
      BF_NAME=$filename".po" 
     echo "renamed pot to po" 
     fi 

      # because we are flattening the URIs to the basename it's possible filenames could be duplicated - so check if the file already exists and if it does prepend COUNT to make the name unique 
     if [ -e translated/$LOCALE_DIR$FILE_LOCALE$BF_NAME ]; then 
      BF_NAME=$COUNT.$BF_NAME 
      echo "Updated BF_NAME" 
     fi 

# if the user asked for 100% published only, test the completion status and skip the file if it's not 100% 
# probably should do this earlier in these loops to avoid other tests that are not needed if we are just going to skip the file 

if [ $P100 == 1 ]; then 

# echo "requested 100 percent only" 

    # use api status call and pull out the part of the response that has the string count and completed string count 
    completeStatus=($(curl -sS -d "apiKey=$SL_APIKEY&projectId=$SL_PROJECT&fileUri=$CURRENT_FILE&locale=$CURRENT_LOCALE" "https://$SERVER_URL/v1/file/status" | grep -Eo "stringCount\":[0-9]+.*\"completedStringCount\":[0-9]+")) 

    # strip those out as separate variables to test 
    stringCount=($(echo $completeStatus | sed -E 's/stringCount\":([0-9]+).*/\1/g')) 
    completedStringCount=($(echo $completeStatus | sed -E 's/.*completedStringCount\":([0-9]+).*/\1/g')) 
#echo $stringCount","$completedStringCount 

    # set download to 1 only if 100% 
    if [ $stringCount == $completedStringCount ]; then 
     download=1 
     echo $CURRENT_FILE" in "$CURRENT_LOCALE" is 100% complete!"$'\n' 

    else 
     download=0 
     echo "100% complete requested. Skipping URI: "$CURRENT_FILE" in "$CURRENT_LOCALE" total:"$stringCount", completed:"$completedStringCount$'\n' 
    fi 

else 
    # didn't ask for 100% so always download 
    download=1 

fi 


if [ $download == 1 ]; then 
     # This is the actual curl command that downloads the given file for given locale, and state as specified when called 
     curl -sS -d "apiKey=$SL_APIKEY&projectId=$SL_PROJECT&fileUri=$CURRENT_FILE&locale=$CURRENT_LOCALE&retrievalType=$DL_TYPE" "https://$SERVER_URL/v1/file/get" > translated/$LOCALE_DIR$FILE_LOCALE$BF_NAME 
     echo "downloaded URI: "$CURRENT_FILE" to: translated/"$LOCALE_DIR$FILE_LOCALE$BF_NAME$'\n' 
fi 


    done 

done 

exit 

This looks great, but where can I get my hands on download-smartling-files.sh? – TheMightyLlama


OK, I found it [here](http://support.smartling.com/attachments/token/najtw67chvl0pue/?name=download-smartling-files.sh) – TheMightyLlama


and I've tweaked it a bit to check against a glob (whatever is there), and updated the parallel -g call, which is deprecated. [Here](https://gist.github.com/TheMightyLlama/106ad6c8bdfbd740c496). Note that you'll need to have parallel installed. – TheMightyLlama
