I have an iPart I'd like to create with surfaces. The surface color is important because we use it as a plate-thickness ID in FEA. When I color one surface blue and another red, I continually get the default color back when using the mirror or array command. We're creating a customizable iPart where there will be some intelligence around how big the array is (we may need 3, 4, or 5 beams based on the unit size). I need a way for the colors on the surfaces to propagate to the new surfaces created by these commands.
I'm trying to change the default variable for the MIRROR command from No to Yes. When you run the MIRROR command, after you choose your second point it asks if you want to erase the source object; the default is set to No, but I would like to change it to Yes. I'm able to change the macro for the mirror button, which works, but only when I push the button. Before the CUI file, I remember everything was done from the acad.pgp or acad.mnu files, but with the new CUI file I can't seem to locate where to change the actual command, and not just the macro for the mirror button.
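One workaround, if the CUI-level default can't be found, is a small AutoLISP wrapper loaded from acaddoc.lsp that always answers Yes to the erase prompt. This is only a sketch: the command name MIRY is my own placeholder, not a standard AutoCAD command.

```lisp
;; Sketch: a MIRROR variant that always erases the source object.
;; MIRY is a placeholder name; load this from acaddoc.lsp so it is
;; available in every drawing.
(defun c:MIRY (/ ss)
  (prompt "\nSelect objects to mirror (source will be erased): ")
  (setq ss (ssget))
  (if ss
    ;; pause twice for the two mirror-line points, then answer Yes
    (command "_.MIRROR" ss "" pause pause "_Y"))
  (princ))
```

You could then point the ribbon or toolbar button's macro at ^C^C_MIRY so the button picks up the new behavior, rather than editing the built-in MIRROR entry.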
I am getting these white "ghost" boxes with every copy, move, or mirror command. They can be individually deleted, but that takes too long. The more there are, the slower each command gets, to the point that a copy can take over a minute. If I save and close, then when I reopen they are gone -- until I execute a command. This is unique to a single file I received from a friend using PC-based AutoCAD; I use AutoCAD for Mac. I had no trouble with his file until I did a "Save As" and used it as a base for a new file. Now, every file that is a copy of it, even any file in which I x-ref it, has the same problem. All of my original, older files are fine. There is simply too much work in these (now) five files to start over. See attached.
Why can't I get the 3DMIRROR command to mirror the magenta box about a ZX plane defined by the middle of the steel T-angle?
First, I use the ZX option; after that fails to work, I try the 3-point option, using a picked first point and two ortho points to define a ZX plane.
Also, there's some randomness in the sense that some of the time the exact same series of steps will produce the correct result, but not this time, as the video below shows.
[URL]....
I'm also attaching the drawing in AutoCAD2010 format.
Still on this family quality-control audit... I've come across some new and entertaining behavior.
Several families (face based) flip host sides with their flip controls as expected, but if I mirror them, instead of flipping along the mirror line as an accurate reflection, they ALSO flip top to bottom, AND around the face.
The right family in both views is the original; the left is the new mirror. (Plan and elevation views in wireframe included.) You can see it landed within the wall, and upside down.
In CS2 I am trying to resave a load of .jpg files, but they have long file names, and every time, CS2 shortens the filenames to make them compatible with Mac users.
It tells me to go to the optimise pop-up box and change the options in there ...
I am following all the instructions I can find to optimize an image for the web. I'm using CS3. Either I get results with too long a download time, or the quality is not good enough. I tried using different settings. I tried making the image as small as possible. I also thought that if it was a JPEG before changing the settings, it would work.
I've made a model I'm content with (an editable poly), added a UVW Map modifier and an Unwrap UVW modifier, rendered the UVW template, taken it into Photoshop, done what I wanted to it, and brought it back in as a diffuse material. All worked fine. I'm going to be using the asset in a prototype game, so I want to decrease the number of triangles rendered. To decrease the polygons I've added an Optimize modifier. This is great for reducing the polygon count, but unfortunately, since it changes the underlying geometry, it appears to be mucking up the UVW coordinates, and my material is all over the place.
What is the correct process or technique to decrease the polygons while preserving the UVW map appearance that was set on the higher-res version? I've tried changing the modifier stack order, but as the shape has a number of irregular surfaces, the unwrapped UVW in Peel mode is a real mess and would be extremely difficult for me to texture with any consistency.
I've used Lightroom since the beginning but am having issues with LR4. One of the problems I am facing is that backing up my LR database brings my entire system to a complete halt for over an hour. It seems to happen during the optimize-catalog phase. I can't do anything else on my computer; my mouse and everything else freezes until the very slow backup completes. I didn't have this issue with any other version of LR. I first hit this problem after upgrading from LR3. When I encountered this and other problems (mostly lockups or very slow operation), I started with a fresh hard drive and reinstalled Windows and LR, but I still had the same problems.
I'm using Windows Vista 64, 8 GB of RAM, a 2.83 GHz CPU, a fast drive with loads of extra space, etc. I do have a large LR database (7 GB), but I had that in LR3 as well.
I can't optimize my main catalog any more. LR5 shows an error message saying I have to restart it in order to check the integrity of the catalog. It then restarts without problems, but I face the same error again when trying to optimize.
Importing the old catalog, or a copy of it, into a newly created one does not work at all; LR states that it cannot create the temporary copy necessary for this. Changing directories or drives does not help either.
The catalog seems to work fine once I open it, but I can't check its integrity.
I just built a new editing system and am exporting a new project; my CPU sits at 58–65% usage and my GPU at 2–6%. RAM usage is about 8 GB. The project goes from DSLR .MOV clips to an H.264 MP4 export: 1080p, 30 fps, 20 Mbps CBR. Total length is 6 min 20 sec, with Warp Stabilizer, contrast, a sharpening filter, and Neat Video applied. As far as I know, all of those effects are CUDA accelerated. Media is on my RAID 0 of two 3 TB Seagate drives (about 350 MB/s read/write), scratch is on a 120 GB SSD, and the OS and Adobe CC are on a 240 GB SSD; both SSDs do about 500 MB/s read/write. Exporting to either the OS drive or the RAID 0 makes no difference. Export time is about 25 min.
Are there any optimizations I need to do to speed things up or is this normal? I ran PPBM7 yesterday and sent in the info. I got the following results from the output file,
I/O time = 111 s at 334.16 MB/s; H.264 timeline = 208 s; MPE gain = 461/18 = 25.6.
Below are PC specs. Everything is stock, not overclocked yet.
4930K, GTX Titan Black, 32 GB RAM at 1866 MHz, 650 W 80 Plus Gold PSU, 240 GB SSD (OS), 120 GB SSD (scratch), 6 TB RAID 0 for media, Windows 8.1, Premiere CC.
I want to optimize the images in my blog posts, since the blog is loading very slowly; because of them, my blog's load time is very high. I have no idea how to optimize images in Photoshop. You can see my images loading very slowly here
I have hundreds of web galleries that I need optimized. I need each photo opened and resaved. However, I need the photos to be saved via "Save for Web" so that the files end up being about 20–30 KB. If I just use "Save As > JPG", the files end up being about 100–130 KB. IMPORTANT NOTES: I want the current files to be overwritten (partly because, if I designate a destination, they will be unorganized, and partly because each photo comes in 4 different versions: bwlrg, bwsml, clrlrg, clrsml).
Is there a plugin or macro available that automatically moves objects around to optimize space? I think it's called nesting in other industries, such as vinyl cutting, lasering, and signs. Basically, you end up with as little white space as possible.
I discovered, with horror, that the catalog HAS CREATED VIRTUAL COPIES FOR EVERY SINGLE ONE of the 15,000+ photos in this catalog. I wanted to relaunch and optimize it, but the button is missing! All I see under General in Catalog Settings is the name and location of the catalog, the date, the size, and how to back up.
Is there a way in Photoshop to optimize a bunch of TIFF files in a folder to JPEGs at the same time? I have about 260 TIFF images of different sizes that I need to convert to JPEGs.
I have many images of slides scanned at high resolution (4800 DPI, maximum 5214×3592 pixels). Although I will be saving these as lossless TIFFs, I also wish to make JPGs from them that are just under 5 MB in file size. Aside from cropping, I know I can reduce JPG file size by some combination of lower-quality JPG compression and reduced image size. My question is: what is theoretically or practically better, achieving this mostly by reducing the image's total pixels or by reducing the JPG compression quality?
In the weldment of a crane boom I have different parts. Some are identical but (1:1) mirrored .ipt files. I then merge them in the BOM in order to set the quantity to 2 in the parts list.
Here the error occurs: the "mass" field in the parts list says "varies".
This keeps me from generating a good parts list, and the total mass of the weldment may be messed up. I personally think it has something to do with the fact that the parts are sheet metal; i.e., when I look at the physical properties of the folded and unfolded states (of the SAME .ipt), they differ in the same way I see in the BOM.
I was test-rendering a scene where you are supposed to be able to look into nearby rooms through glass walls, but the glass reflected the surroundings like a mirror. In some tries I could barely see some furniture in the other room, so the transparency isn't completely gone.
I use a solid glass material. At first I thought the material's reflection value was too high, but it was set to 5; I tried reducing it to 1, but the rendering came out just the same.
I have tried different lighting options, but again... the same miserable result.
AutoCAD 2011. I am having no luck trying to drag a command from the Command List pane to the Quick Access Toolbar. I have no trouble dragging to Tool Palettes.
I am currently writing a series of routines for setting the layers for text, leaders and dimension commands. The end goal is a system where any annotation command sets the correct layer for the duration of the command, then reverts back to the layer that was active before the command.
I have managed to complete all the code, and it appears to be working fine. I just have one question: I have used -layer "m" "Lay_name" etc. for all the layer-setting commands, rather than any code to check whether the layer already exists. In my limited testing this seems to be suitable; nothing that already exists on that layer seems to be affected.
I know how to write code that checks whether the layer already exists and simply sets it current instead, but so far that seems unnecessary?
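For what it's worth, -LAYER's Make option already does both jobs: it creates the layer if it is missing and sets it current, and it leaves an existing layer's contents untouched, which would explain why a separate existence check seems unnecessary. Here is a minimal sketch of the save/restore pattern described above; the command name DIMX and the layer name Anno-Dims are my own placeholders, not anything from the original routines.

```lisp
;; Sketch: set the annotation layer for one command, then revert.
;; DIMX and "Anno-Dims" are placeholder names.
(defun c:DIMX (/ oldlay)
  (setq oldlay (getvar "CLAYER"))           ; remember the active layer
  (command "_.-LAYER" "_M" "Anno-Dims" "")  ; Make: create if needed, set current
  (command "_.DIMLINEAR")                   ; start the annotation command
  (while (> (getvar "CMDACTIVE") 0)         ; let the user finish it interactively
    (command pause))
  (setvar "CLAYER" oldlay)                  ; revert to the previous layer
  (princ))
```

In practice you would also want a *error* handler that restores CLAYER, so the drawing isn't left on the annotation layer if the user hits Esc mid-command.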