2012-09-01

I have a Job model which can have many attachments. The Attachment model has a CarrierWave uploader mounted on it. How can I clone a record and copy its remote files to a new location?

class Job < ActiveRecord::Base
  has_many :attachments
end

class Attachment < ActiveRecord::Base
  mount_uploader :url, AttachmentUploader

  belongs_to :job
end

Jobs can be cloned, and cloning a job should create new Job and Attachment records. That part is easy.

The system then needs to copy the physical files to the upload location associated with the cloned job.
Is there a simple way to do this with CarrierWave? The solution should support both the local filesystem and AWS S3.

class ClonedJob
  def self.create_from(orig_job)
    @job_clone = orig_job.dup

    if orig_job.attachments.any?
      orig_job.attachments.each do |attach|
        cloned_attachment = attach.dup
        # Need to physically copy files at this point. Otherwise
        # this cloned_attachment will still point to the same file
        # as the original attachment.
        @job_clone.attachments << cloned_attachment
      end
    end
  end
end

Answers


Below is the module I put together to accomplish this. It works, though there are still a few things I would improve if it mattered enough. I've left my thoughts as comments in the code.

require "fileutils"

# IDEA: I think it would make more sense to create another module
# which I could mix into Job for copying attachments. Really, the
# logic for iterating over attachments should be in Job. That way,
# this class could become a more generalized class for copying
# files whether we are on local or remote storage.
#
# The only problem with that is that I would like to not create
# a new connection to AWS every time I copy a file. If I do then
# I could be opening loads of connections if I iterate over an
# array and copy each item. Once I get that part fixed, this
# refactoring should definitely happen.

module UploadCopier
  # Take a job which is a reprint (i.e. its original_id
  # is set to the id of another job) and copy all of
  # the original job's remote files over for the reprint
  # to use.
  #
  # Otherwise, if a user edits the reprint's attachment
  # files, the files of the original job would also be
  # changed in the process.
  def self.copy_attachments_for(reprint)
    case storage
    when :file
      UploadCopier::LocalUploadCopier.copy_attachments_for(reprint)
    when :fog
      UploadCopier::S3UploadCopier.copy_attachments_for(reprint)
    end
  end

  # IDEA: Create another method which takes a block. This method
  # can check which storage system we're using and then call
  # the block and pass in the reprint. Would DRY this up a bit more.

  def self.copy(old_path, new_path)
    case storage
    when :file
      UploadCopier::LocalUploadCopier.copy(old_path, new_path)
    when :fog
      UploadCopier::S3UploadCopier.copy(old_path, new_path)
    end
  end

  def self.storage
    # HACK: I should ask CarrierWave what method to use
    # rather than relying on the config variable.
    APP_CONFIG[:carrierwave][:storage].to_sym
  end

  class S3UploadCopier
    # Copy the originals of a certain job's attachments over
    # to a location associated with the reprint.
    def self.copy_attachments_for(reprint)
      reprint.attachments.each do |attachment|
        orig_path = attachment.original_full_storage_path
        # We can pass :fog in here without checking because
        # we know it's :fog since we're in the S3UploadCopier.
        new_path = attachment.full_storage_path
        copy(orig_path, new_path)
      end
    end

    # Copy a file from one place to another within a bucket.
    def self.copy(old_path, new_path)
      # INFO: http://goo.gl/lmgya
      object_at(old_path).copy_to(new_path)
    end

    def self.object_at(path)
      bucket.objects[path]
    end

    # IDEA: This will be more flexible if I go through
    # Fog when I open the connection to the remote storage.
    # My credentials are already configured there anyway.

    # Get the s3 bucket currently in use.
    def self.bucket
      s3 = AWS::S3.new(access_key_id: APP_CONFIG[:aws][:access_key_id],
                       secret_access_key: APP_CONFIG[:aws][:secret_access_key])
      s3.buckets[APP_CONFIG[:fog_directory]]
    end

    # A bare `private` has no effect on `def self.` methods,
    # so mark the helpers private explicitly.
    private_class_method :object_at, :bucket
  end

  # This will only be used in development when uploads are
  # stored on the local file system.
  class LocalUploadCopier
    # Copy the originals of a certain job's attachments over
    # to a location associated with the reprint.
    def self.copy_attachments_for(reprint)
      reprint.attachments.each do |attachment|
        # We have to pass :file in here since the default is :fog.
        orig_path = attachment.original_full_storage_path
        new_path = attachment.full_storage_path(:file)
        copy(orig_path, new_path)
      end
    end

    # Copy a file from one place to another within the
    # local filesystem.
    def self.copy(old_path, new_path)
      FileUtils.mkdir_p(File.dirname(new_path))
      FileUtils.cp(old_path, new_path)
    end
  end
end
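The block-taking dispatch suggested in the IDEA comment could look like the sketch below. Everything here is a stand-in so it runs without CarrierWave or `APP_CONFIG`: the hard-coded `storage` value and the empty copier classes are placeholders for the real configuration and classes above.

```ruby
# Sketch of the block-based dispatch the IDEA comment suggests.
# storage would really come from APP_CONFIG[:carrierwave][:storage],
# and the constants would be the real copier classes.
module UploadCopierDispatch
  LocalUploadCopier = Class.new
  S3UploadCopier    = Class.new

  def self.storage
    :file # stand-in value
  end

  # Resolve the copier class once and hand it to the caller's block,
  # so copy and copy_attachments_for no longer need their own
  # case statements.
  def self.with_copier
    yield(storage == :fog ? S3UploadCopier : LocalUploadCopier)
  end
end

UploadCopierDispatch.with_copier { |copier| puts copier }
```

With that in place, `copy` would collapse to something like `with_copier { |c| c.copy(old_path, new_path) }`.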

I use it like this:

# Have to save the record first because it needs to have a DB ID. 
if @cloned_job.save 
    UploadCopier.copy_attachments_for(@cloned_job) 
end 
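The local-filesystem branch can be sanity-checked in isolation. This is a minimal sketch using a temporary directory; the paths and file contents are made up for the demonstration.

```ruby
require "fileutils"
require "tmpdir"

# Stand-in for UploadCopier::LocalUploadCopier.copy from the answer above:
# create the destination directory, then copy the file into it.
def copy_upload(old_path, new_path)
  FileUtils.mkdir_p(File.dirname(new_path))
  FileUtils.cp(old_path, new_path)
end

Dir.mktmpdir do |root|
  original = File.join(root, "uploads/job_1/spec.pdf")
  clone    = File.join(root, "uploads/job_2/spec.pdf")
  FileUtils.mkdir_p(File.dirname(original))
  File.write(original, "PDF bytes")

  copy_upload(original, clone)

  # The two jobs now have physically separate files, so editing
  # the clone's file leaves the original untouched.
  File.write(clone, "edited")
  puts File.read(original) # => "PDF bytes"
end
```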
class Job < ActiveRecord::Base
  has_many :attachments
end

class Attachment < ActiveRecord::Base
  mount_uploader :attachment, AttachmentUploader
  belongs_to :job
end

class ClonedJob
  def self.create_from(orig_job)
    @job_clone = orig_job.dup

    if orig_job.attachments.any?
      orig_job.attachments.each do |attach|
        cloned_attachment = attach.dup
        @job_clone.attachments << cloned_attachment
        # !!! Here is the trick
        cloned_attachment.remote_attachment_url = attach.attachment_url
      end
    end
  end
end

See http://stackoverflow.com/questions/20361702/carrierwave-creating-a-duplicate-attachment-when-duplicating-its-containing-mod which also uses `remote_*_url`
