
Improve backup script to use less memory as my database size approaches 150 MB (Erlang process crashes)

Below is my current backup script. Any suggestions on how to improve it so that I don't load the entire backup into memory, but instead stream it directly to S3?

defmodule Backup do
  require Logger
  alias MyApp.{Repo, BackupUploader, S3}

  @database System.get_env("DATABASE_URL")
  @bucket Application.get_env(:arc, :bucket)
  @folder "backups"

  def start do
    Logger.info "*** Initiating database backup ***"
    backup = %BackupRequest{}

    backup
    |> dump_database
    |> upload_to_s3
  end

  defp dump_database(%BackupRequest{} = backup) do
    Logger.info "*** Dumping database ***"
    command = "pg_dump"
    args = [@database]

    # System.cmd/2 collects pg_dump's entire stdout into one binary,
    # so the whole dump sits in memory before it is uploaded
    {data, 0} = System.cmd command, args

    %{backup | data: data, status: "dumped"}
  end

  defp upload_to_s3(%BackupRequest{data: data}) do
    Logger.info "*** Uploading to S3 bucket ***"
    key = get_s3_key()
    ExAws.S3.put_object!(@bucket, key, data)

    Logger.info "*** Backup complete ***"
  end

  # Helpers

  defp get_s3_key do
    {{year, month, day}, {hour, minute, _seconds}} =
      :os.timestamp() |> :calendar.now_to_datetime()

    hash = SecureRandom.urlsafe_base64(32)
    date = "#{day}-#{month}-#{year}-#{hour}:#{minute}"

    @folder <> "/#{date}_#{hash}_#{Mix.env()}"
  end
end
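
For reference, here is a minimal sketch of one way to keep the dump out of the VM's memory: have pg_dump write to a temporary file with its -f flag, then stream that file to S3 as a multipart upload. It assumes a recent ex_aws in which S3 operations are built with ExAws.S3.Upload.stream_file/1 and ExAws.S3.upload/4 and executed with ExAws.request!/1 (a newer style than the put_object! call above); reusing the data field to carry the file path is purely illustrative.

defp dump_database(%BackupRequest{} = backup) do
  Logger.info "*** Dumping database ***"
  path = Path.join(System.tmp_dir!(), "backup_#{:os.system_time(:seconds)}.dump")

  # -f tells pg_dump to write straight to disk, so the dump never
  # passes through the BEAM as a single large binary
  {_, 0} = System.cmd("pg_dump", ["-f", path, @database])

  %{backup | data: path, status: "dumped"}
end

defp upload_to_s3(%BackupRequest{data: path}) do
  Logger.info "*** Uploading to S3 bucket ***"

  path
  |> ExAws.S3.Upload.stream_file()          # lazily reads the file in chunks
  |> ExAws.S3.upload(@bucket, get_s3_key()) # multipart upload, part by part
  |> ExAws.request!()

  File.rm!(path)
  Logger.info "*** Backup complete ***"
end

With this shape, memory use is bounded by the multipart part size (5 MB by default) rather than by the size of the database, and the temporary file is removed once the upload succeeds. Streaming pg_dump's stdout through a port would avoid the temp file entirely, but at the cost of noticeably more plumbing.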
