I have a very strange problem. I use a Dell Precision M4500 notebook with an NVIDIA Quadro FX 880M video card. I just recently switched from Windows 7 to Linux, after using Windows for 20 years, so I'm a complete Linux beginner. I started with Mint 17.2 Xfce 64-bit, installed the proprietary NVIDIA driver 340.65, and made some minor changes to get OpenCL recognized by Darktable. Then I processed some RAW images with OpenCL support and everything worked.
Because I wanted a rolling release, I switched last week to LMDE 2 Cinnamon 64-bit. Again I installed the proprietary NVIDIA driver 340.65, plus some additional packages to get OpenCL recognized by Darktable. And now the very strange problem: as soon as Darktable begins to process RAW images, X freezes and dmesg reports plenty of "NVRM: os_schedule: Attempted to yield the CPU while in atomic or interrupt context" every 2 seconds, and finally "NMI watchdog: BUG: soft lockup - CPU#1 stuck for 23s!". Sometimes X gets terminated and I get logged out. In the hope that the latest video driver would fix it, I updated to NVIDIA driver 340.93, but the behavior is still the same. When I turn off OpenCL support in Darktable, everything works well.
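For reference, these are the kind of checks one can run to see whether the driver exposes OpenCL at all before blaming Darktable (just a sketch; the grep pattern is only my guess at the interesting lines):

```shell
# Ask the OpenCL ICD loader which platforms/devices it sees; if the Quadro
# does not show up here, Darktable cannot use it either.
# clinfo comes from the "clinfo" package on Debian/Ubuntu-based systems.
if command -v clinfo >/dev/null 2>&1; then
    opencl_info=$(clinfo | grep -iE 'platform name|device name')
else
    opencl_info="clinfo not installed (sudo apt-get install clinfo)"
fi
echo "$opencl_info"

# Darktable's own OpenCL trace explains why OpenCL was enabled or rejected:
#   darktable -d opencl
# After a freeze, save the kernel messages for a bug report:
#   dmesg | grep -i NVRM > nvrm-errors.txt
```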
So I wonder: is there any difference in the NVIDIA driver between Ubuntu (on which Mint 17.2 is based) and Debian (on which LMDE 2 is based)? Or is it because I use Cinnamon instead of Xfce? Is there anything I can configure or install to get OpenCL working with Darktable, or do I have to switch to Mint 17.2 Cinnamon to get it working?
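In case it is useful: as far as I know, Darktable keeps the OpenCL switch in its per-user config file, so it can also be toggled there instead of in the GUI (the path below assumes the default location; edit it only while Darktable is closed):

```
# ~/.config/darktable/darktablerc
opencl=FALSE   # workaround for the freezes; set back to TRUE once the driver issue is solved
```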
Here is some information about my system:
I appreciate any help. Thanks in advance!
Code:
System:    Host: xxxxxxx Kernel: 3.16.0-4-amd64 x86_64 (64 bit gcc: 4.8.4)
           Desktop: Cinnamon 2.6.13 (Gtk 2.24.25-3) Distro: LinuxMint 2 betsy
Machine:   System: Dell product: Precision M4500 v: 0001
           Mobo: Dell model: 0RRH3K v: A01 Bios: Dell v: A15 date: 12/05/2013
CPU:       Dual core Intel Core i5 M 540 (-HT-MCP-) cache: 3072 KB
           flags: (lm nx sse sse2 sse3 sse4_1 sse4_2 ssse3 vmx) bmips: 10107
           Clock Speeds: 1: 1199 MHz 2: 2399 MHz 3: 1199 MHz 4: 1733 MHz
Graphics:  Card: NVIDIA GT216GLM [Quadro FX 880M] bus-ID: 01:00.0
           Display Server: X.Org 1.16.4 driver: nvidia Resolution: firstname.lastname@example.org
           GLX Renderer: Quadro FX 880M/PCIe/SSE2 GLX Version: 3.3.0 NVIDIA 340.93 Direct Rendering: Yes
Audio:     Card-1 Intel 5 Series/3400 Series High Definition Audio driver: snd_hda_intel bus-ID: 00:1b.0
           Card-2 NVIDIA GT216 HDMI Audio Controller driver: snd_hda_intel bus-ID: 01:00.1
           Sound: Advanced Linux Sound Architecture v: k3.16.0-4-amd64
Network:   Card-1: Intel Centrino Advanced-N 6200 driver: iwlwifi v: in-tree: bus-ID: 03:00.0
           IF: wlan0 state: up mac: <filter>
           Card-2: Intel 82577LM Gigabit Network Connection driver: e1000e v: 2.3.2-k port: 8040 bus-ID: 00:19.0
           IF: eth0 state: down mac: <filter>
Drives:    HDD Total Size: 500.1GB (33.7% used) ID-1: /dev/sda model: Samsung_SSD_840 size: 500.1GB
Partition: ID-1: / size: 29G used: 7.1G (26%) fs: ext4 dev: /dev/sda2
           ID-2: /home size: 403G used: 143G (38%) fs: ext4 dev: /dev/sda3
           ID-3: swap-1 size: 8.39GB used: 0.00GB (0%) fs: swap dev: /dev/sda1
Sensors:   System Temperatures: cpu: 48.0C mobo: N/A gpu: 0.0:59C
           Fan Speeds (in rpm): cpu: N/A
Info:      Processes: 201 Uptime: 3:47 Memory: 1535.8/7993.6MB Init: SysVinit runlevel: 2 Gcc sys: 4.9.2
           Client: Shell (bash 4.3.301) inxi: 2.1.28
P.S. English is not my native language, so there are most probably spelling and grammar mistakes.